AEA365 | A Tip-a-Day by and for Evaluators


Rachel Schechter

My name is Rachel Schechter, Director of Research at Lexia Learning Systems LLC, A Rosetta Stone company. At Lexia, our mission is to improve student literacy by leveraging technology to personalize learning and simplify the use of data to drive instruction. The research team at Lexia is committed to evaluating the efficacy and validity of our products, informing product design, partnering with customers to evaluate their implementations and student progress, and disseminating research findings and best practices.

A large part of my job is to develop dashboards and data visualizations to help communicate findings to evaluation stakeholders. In an effort to be more personalized in our reporting, I’ve been thinking a lot about the balance between scalability and customization – relying on templates vs. creating fresh content for each project or constituency.

Recently, one of the largest school districts in the country requested custom reporting for Lexia® RAPID™ Assessment, my company’s online literacy screener. The request came through internal Lexia staff working with the district. Initially, I was told that they “just need the basic info” organized into networks of schools. I took our existing templates and mocked up what seemed like a small adjustment.

The following week I presented samples to the district leaders, and they said that the graphs and tables didn’t look like the report designs that they were used to. I shifted quickly and turned their attention to characteristics like format, levels of summaries (grade, network, district, school) and graph type (stacked columns, bar) to better understand what resonated with them. Then I asked for samples of their commonly used reporting so I could pull design elements that were familiar.

A few weeks later I presented the updated reporting to the district leader. She commented that she “saw her feedback” in the revisions and how “heard” she felt by our team. Success! She provided a final round of feedback related to color choice and ordering of groups – easy to adjust in time for the final report delivery. The training for all administrators is this week, and I feel confident that they will be able to use the information in the customized reporting to make instructional decisions at the network and school level — the whole point of assessment!

Lesson Learned:

No matter what you’ve heard from your colleagues or others involved in a relationship, always begin with a needs assessment so that you start “from the beginning” with the evaluation end-user.

Hot Tip:

Get samples from clients/customers of reports created by their internal team! Using chart types and dashboard setups that stakeholders are familiar with will facilitate their understanding and make the reports more useful.

Pay attention to details: something as small as a color choice or the order of the items in a stacked bar chart brings meaning to the information. Check assumptions about those qualities along your design journey.

Note: Data in each image do not match, are not from a single district, and are not reflective of the district mentioned in the story.

[Images: first mockup of data; sample graph provided by the school; revised graph; final graph]

The American Evaluation Association is celebrating Data Visualization and Reporting (DVR) Week with our colleagues in the DVR Topical Interest Group. The contributions all this week to aea365 come from DVR TIG members. Do you have questions, concerns, kudos, or content to extend this aea365 contribution? Please add them in the comments section for this post on the aea365 webpage so that we may enrich our community of practice. Would you like to submit an aea365 Tip? Please send a note of interest to aea365@eval.org. aea365 is sponsored by the American Evaluation Association and provides a Tip-a-Day by and for evaluators.


My name is Matt Linick, and I am the Executive Director of Research and Evaluation at the Cleveland Metropolitan School District (CMSD). We are excited that the American Evaluation Association is coming to Cleveland this year!

As many embedded researchers and evaluators in school districts know, we are often under-resourced and over-committed, so partnering with research organizations is an important tool for meeting the needs of the community. Last year, CMSD, along with the Center for Urban Education at Cleveland State University and the American Institutes for Research, formed the Cleveland Alliance for Education Research (CAER). This partnership is helping the district investigate questions about our social-emotional learning and school culture work with our Humanware team, answer questions about our students who are English language learners with our Multilingual Multicultural Education team, and prioritize our research and evaluation questions.

Rad Resource:

Creating a partnership between a local or state education agency and research organizations is hard work and can be overwhelming. Don’t Panic![1] Others have struggled with this task, and there are organizations and resources available to you. One such resource is the National Network of Education Research-Practice Partnerships (NNERPP). Their website has resources, information, and guidance for those in education organizations and the organizations they partner with. The guidance they provide can also serve as a template for partnerships outside the education landscape. For those new to this work, NNERPP provides a guide to building the foundation of your partnership and walks you through the important questions. For those who have already started partnering informally but are looking to create a more formal structure (like CAER), sample MOUs, charters, and job descriptions are provided. They even have an Education Week blog that features researchers and practitioners and their reflections as they pursue this productive struggle.

Hot Tips:

Over six years ago, Cleveland embarked on Cleveland’s Plan for Transforming Schools. During that time, we have launched and expanded many innovative new schools and brought a diversity of options to Cleveland’s students. While at the conference, attendees will be within walking distance of several CMSD high schools with exciting programs. MC2STEM High School (http://www.mc2stemhighschool.org/) is an exciting STEM school with three campuses located within a Fortune 500 company, a college campus, and the Great Lakes Science Center (next to the Rock and Roll Hall of Fame). The Cleveland High School for Digital Arts is located downtown and provides students with an exciting opportunity to master academic content through project-based learning focused on digital arts. Located in the same building is the Davis Aerospace and Maritime High School. Davis is a STEM school that focuses on providing students with a curriculum that emphasizes college and career readiness through real-world opportunities in aerospace and maritime careers.


We hope you enjoy our fantastic city, visit the wonderful entertainment options near the conference center, and learn some more about the exciting new things Cleveland is doing for students and families.

[1] To quote the wonderful Douglas Adams.

 

We’re looking forward to the fall and the Evaluation 2018 conference all this week with our colleagues in the Local Arrangements Working Group (LAWG). Do you have questions, concerns, kudos, or content to extend this aea365 contribution? Please add them in the comments section for this post on the aea365 webpage so that we may enrich our community of practice. Would you like to contribute to aea365? Review the contribution guidelines and send your draft post to aea365@eval.org.

Hi! My name is Daina Lieberman, English teacher and International Baccalaureate (IB) Middle Years Programme (MYP) Coordinator at South Lakes High School in Fairfax County Public Schools, Virginia. I am also a recent graduate of the Ed.D. in Educational Administration and Policy Studies program at The George Washington University. Today I’d like to provide some tips on Project-Based Learning.

Hot Tips:

As an IB MYP Coordinator, I work with teachers in my building to create, implement, and assess performance-based assessments in all subject areas, including PBLs. Project-Based Learning, or PBL, has become an important method of teaching and assessment in schools. Instead of being taught a unit and then creating a project, students are asked an open-ended, driving question that requires them to research and learn information to solve a problem. Their final work may vary in form and content, but students need to collaborate, think critically and creatively, conduct research, and demonstrate their understanding.

PBL sets up situations that allow students to solve real-world problems and create authentic solutions.  As adults, we solve our problems in the same way—if we want to buy our first house, we conduct research, ask professionals for help, take action, reflect, make adjustments, and hopefully purchase a home successfully.  Teachers need to guide students throughout their inquiry phase to ensure they are learning appropriate and factual content relevant to solving the problem and answering the driving question.

PBL is a great way to enable English language learners, special ed students, advanced students, and all other students to demonstrate their learning in ways teachers can assess and students can enjoy.  This type of assessment can be used with students at any level, including undergraduate and graduate.

Be sure when assessing PBL work that your rubric is assessing student learning, not behavior or completion.  Check in with other teachers who have conducted PBL units and look at various rubrics before creating one; ask a colleague to look it over to ensure you are assessing what you want to assess.  You can also work with your students and have them help you create a rubric to assess their work.

Have fun!

Rad Resources:

For a great definition of performance-based assessments, check out Patricia Hilliard’s article on Edutopia called Performance-Based Assessment: Reviewing the Basics, or this booklet from the Stanford School Redesign Network called What is Performance-Based Assessment?, which includes research and examples of PBAs.

Check out this page on Edutopia for articles and videos on Project-Based Learning and this Research Spotlight on Project-Based Learning from the NEA. Resources and Tools for PBL Start to Finish on Edutopia is another great page with even more resources and links to help you get started.

For more information on developing performance-based assessments and rubrics, read Doug Wren’s AEA blog post on the topic and have a look at Ross Cooper’s blog post on Project-Based Learning Professional Development (part 2): Student Created Rubrics on ASCD Edge.

The American Evaluation Association is celebrating Consortium for Research on Educational Assessment and Teaching (CREATE) week. The contributions all this week to aea365 come from members of CREATE. Do you have questions, concerns, kudos, or content to extend this aea365 contribution? Please add them in the comments section for this post on the aea365 webpage so that we may enrich our community of practice. Would you like to submit an aea365 Tip? Please send a note of interest to aea365@eval.org. aea365 is sponsored by the American Evaluation Association and provides a Tip-a-Day by and for evaluators.

 

This is John Fischetti, Dean of Education/Head of School at the University of Newcastle in Australia. We are one of Australia’s largest providers of new teachers and postgraduate degrees for current educators. We are committed to equity and social justice as pillars of practice, particularly in evaluation and assessment.

Lessons Learned:

It is with that equity lens that I want to share an Australian story.

In early May 2018, the Australian government launched a new report on the failure of Australian schools. It challenges the current schooling system by calling out the vestiges of the assembly-line industrial age of education and the current lack of investment in “individualized” learning and future-focused skills. It calls for new types of online formative assessment and new learning-progression schemes that build literacy and numeracy skills early and make years 11 and 12 of high school more creative and innovation-based.

The premise of this new scheme is in line with the best thinkers in the world (from Guskey to Zhao) and the most progressive nations in the world (yes, sorry folks: Finland, Switzerland, Belgium, and the Netherlands). However, the assessment recommendations are a reboot of more of the same. Assembly-line assessments in the early years are perhaps the opposite of how to boost literacy and numeracy early on. The report asks for massive changes to an assembly-line reality by advocating for more assessment assembly lines. And some of the recommendations in the report are already failing elsewhere, such as in New Zealand’s system, where young people can face a test a day.

Hot Tips:

I recommend that all of us who work in schools and with student performance data spend time this year advocating for reinventing these systems. Our job is to prepare children to be successful in their futures. To do that, they need the knowledge, skills, and dispositions to be passionate, vibrant, dynamic, curious, open-minded, engaged (and literate and numerate) participants in their own journeys. We can’t assembly-line assess that.

One urban legend definition of insanity is “doing the same things over and over again and expecting better results.” When assembly line schooling is transformed to individualized learning, but the assessment scheme is from the same original mindset, we have the cart in front of the horse. And that is insane. “Stop, drop and test” assessment schemes are obsolete. It is time we in the field called this out and moved forward to build learning centers instead of testing centers. 

Rad Resources:

Gonski Review Attacks Australian Schooling Quality and Urges Individualized Teaching Approach

Thomas Guskey: What we know about pre-assessments.

Yong Zhao: What Works Can Hurt: Side Effects in Education

The American Evaluation Association is celebrating Consortium for Research on Educational Assessment and Teaching (CREATE) week. The contributions all this week to aea365 come from members of CREATE. Do you have questions, concerns, kudos, or content to extend this aea365 contribution? Please add them in the comments section for this post on the aea365 webpage so that we may enrich our community of practice. Would you like to submit an aea365 Tip? Please send a note of interest to aea365@eval.org. aea365 is sponsored by the American Evaluation Association and provides a Tip-a-Day by and for evaluators.

 


Hi, I’m Chad Green, program analyst at Loudoun County Public Schools in northern Virginia. Over the past year I’ve been seeking out developmental evaluation (DE) practitioners in school districts throughout the U.S. and abroad. Recently I had the pleasure of interviewing Keiko Kuji-Shikatani (C.E.), an educator and internal evaluator with the Ontario Ministry of Education. She also helped launch the Credentialed Evaluator designation process for the Canadian Evaluation Society (CES).

Credentialed Evaluators (currently 394 in total) are committed to continuous professional learning which, as Keiko explained, is also the focus of DE. More specifically, DE “supports innovation development to guide adaptation to emergent and dynamic realities in complex environments” (Patton, 2010). Keiko believes that DE is well-suited to public sector work, in which adaptation and innovation are the norm in providing services, given the changing realities of society.

Hot Tips:

  • The best way to introduce DE, whether to program/policy staff or senior leadership, is to be conscious that DE is about learning, and that when properly applied, evaluation capacity building is happening 24/7.
  • DE involves learning as you go, which requires evaluators to engage in systems thinking so they can zoom in and out as they work and continue to co-create innovative solutions to complex challenges.
  • DE is not evaluation light. Developmental evaluators must have a thorough knowledge of evaluation so they can facilitate user-centric use of the learning gained from the DE approach (i.e., a focus on utilization) in real time to tackle complex issues.

Keiko prefers to use conventional evaluation tools like logic models to co-construct a theory of change with the team of stakeholders, resulting in a shared understanding of the evolving evaluand. What is unique here is that she insists on describing their ideas in full sentences, much like the clear language used in the AEA Evaluator Competencies, rather than short phrases, so as to avoid the misunderstandings that arise easily when complexity is the norm in huge systems such as hers.

Once the team members feel like the desired changes are plausible, she helps them to co-construct the theory of action so that they can collaboratively embed evaluative thinking in the way they work and make the changes feasible. She then takes the team further into what the year looks like to identify (a) the forks in the road where evaluation rigor is fundamental and (b) the use of appropriate data collection methods, analysis, and user-centric use of data so DE or “learning as we go” becomes the way the team makes sense of changing circumstances.

Rad Resources:

The American Evaluation Association is celebrating Ed Eval TIG Week with our colleagues in the PreK-12 Educational Evaluation Topical Interest Group. The contributions all this week to aea365 come from our Ed Eval TIG members. Do you have questions, concerns, kudos, or content to extend this aea365 contribution? Please add them in the comments section for this post on the aea365 webpage so that we may enrich our community of practice. Would you like to submit an aea365 Tip? Please send a note of interest to aea365@eval.org. aea365 is sponsored by the American Evaluation Association and provides a Tip-a-Day by and for evaluators.

My name is Garima Bansal. I am a faculty member at the University of Delhi, India, where I teach a course on assessment education to teacher candidates enrolled in the Bachelor of Education (B.Ed) program. Reading DeLuca, Chavez, Bellara, and Cao’s 2013 paper in the journal The Teacher Educator on pedagogies that inform assessment education helped me better understand the teaching and learning experiences in my own class. In particular, their four pedagogical activities – 1) perspective-building conversations, 2) praxis: connecting theory and practice, 3) modelling: practice what you preach, and 4) critical reflection and planning for professional learning – reflected the teaching that occurred in my class. Drawing on these four pedagogical activities, in this article I describe how I worked to develop the assessment literacy of teacher candidates enrolled in the B.Ed program.

Lessons Learned:

  • Multiple perspective conversations: Teacher candidates were given prompts to reflect upon, such as newspaper items and media reports about the side effects of formal testing on various stakeholders (for example, “India’s examination system is only focussed on exam. Knowledge is not a priority” from the magazine Outlook India).

They shared their own experiences to build a diverse yet coherent vision of various school-related evaluation issues. They spoke about how shifts in evaluation policy came as a surprise to them in 2009, when Continuous and Comprehensive Evaluation replaced the Class X formal examination, among other changes.

  • Collaborative assessment projects: Collaborative projects among school teachers, university faculty, and teacher candidates were initiated on different themes relating to student assessment. These projects enabled candidates to prepare differentiated assessment tasks for learners of multiple abilities studying in the same class, and to collect samples of formative assessment tasks and critically analyse them for the nature of the learning involved. Undertaking and analysing these projects gave them both a global and a local perspective on different evaluation issues.
  • Modelling: Formative assessment components were embedded across the different courses of the two-year Bachelor of Education (B.Ed) teacher professional development program. The program included a specific paper on teacher assessment education, Assessment for Learning, taught to teacher candidates in the second year. Although candidates were taught specific tools and techniques for creating and conducting formative assessments in this course, they were also continually given practice in taking formative assessments themselves through the program’s other courses.
  • Addressing assessment dilemmas: During their internships in schools, teacher candidates were asked to explore various assessment dilemmas faced by in-service teachers – between formative and summative assessment, or between stressing competition and cooperation – and to propose possible solutions.
  • Educational evaluation policy analysis: Teacher candidates were asked to reflect on shifts in educational evaluation policy across the world. In groups of four, they developed projects explicating the causes of those shifts and the strengths and weaknesses of existing policies.
  • Reflections: Candidates were constantly encouraged to reflect on the strengths and weaknesses of the assessment plans they made for school students. I simultaneously sought their feedback on the assessment pedagogy I used in this program. (Rad Resource: the TCrunch app, for instantaneous feedback and managing assignments, is a free download from the Apple and Google Play stores.)

The American Evaluation Association is celebrating Ed Eval TIG Week with our colleagues in the PreK-12 Educational Evaluation Topical Interest Group. The contributions all this week to aea365 come from our Ed Eval TIG members. Do you have questions, concerns, kudos, or content to extend this aea365 contribution? Please add them in the comments section for this post on the aea365 webpage so that we may enrich our community of practice. Would you like to submit an aea365 Tip? Please send a note of interest to aea365@eval.org. aea365 is sponsored by the American Evaluation Association and provides a Tip-a-Day by and for evaluators.


Hello! We are Dana Linnell Wanzer, evaluation doctoral student, and Tiffany Berry, research associate professor, from the Youth Development Evaluation Lab at Claremont Graduate University. Today we are going to discuss the importance of high quality relationships with practitioners in evaluations.

“In the absence of strong relationships and trust, partnerships usually fail.”

-Henrick, Cobb, Penuel, Jackson, & Clark, 2017, p. 5

Research on factors that promote evaluation use often identifies stakeholder involvement as a key component (Alkin & King, 2017; Johnson et al., 2009). However, collaboration with practitioners is insufficient to promote use; rather, partners must also develop and maintain high quality relationships. For example, district leaders stress the importance of building productive relationships for promoting use of evaluations in their district (e.g., Harrison et al., 2017; Honig et al., 2017).

The importance of high quality relationships has been stressed through the focus on participatory or collaborative approaches to evaluation and through the inclusion of interpersonal factors in the evaluator competencies. Furthermore, utilization-focused evaluation (Patton, 2008) states that “evaluators need skills in building relationships, facilitating groups, managing conflict, walking political tightropes, and effective interpersonal communications” (p. 83) to promote use.

Lesson Learned: In our experiences as evaluators, the programs that have made the greatest strides in using evidence to inform decision-making are those that have a strong, caring relationship with the evaluation team. We genuinely want to see each other succeed; we are friendly and enjoy being together. We do not approach the relationship as a series of tasks to perform; rather, the relationship affords us the opportunity to dialogue honestly about the strengths, weaknesses, or gaps in programming that should be addressed. Without authentically enjoying each other’s company, it becomes a chore to meet, and the informal opportunities to chat about using evidence to improve programs are reduced.

Hot Tip: High quality relationships are characterized by factors such as:

  • Trust
  • Respect
  • Dependability
  • Warmth
  • Psychological safety
  • Long-term commitment to mutual goals
  • Liking one another and feeling close to each other

Rad Resource: King and Stevahn (2013) describe interactive evaluation practice as “the intentional act of engaging people in making decisions, taking action, and reflecting while conducting an evaluation study” (p. 14). They describe six principles for interactive evaluation practice: (1) get personal, (2) structure interaction, (3) examine context, (4) consider politics, (5) expect conflict, and (6) respect culture. They also provide 13 interactive strategies that can be used to promote positive interdependence among partners.

Rad Resource: Are you interested in assessing the effectiveness of your collaboration, especially its relationship quality? Check out the Collaboration Assessment Tool, especially the membership subscale!

The American Evaluation Association is celebrating Ed Eval TIG Week with our colleagues in the PreK-12 Educational Evaluation Topical Interest Group. The contributions all this week to aea365 come from our Ed Eval TIG members. Do you have questions, concerns, kudos, or content to extend this aea365 contribution? Please add them in the comments section for this post on the aea365 webpage so that we may enrich our community of practice. Would you like to submit an aea365 Tip? Please send a note of interest to aea365@eval.org. aea365 is sponsored by the American Evaluation Association and provides a Tip-a-Day by and for evaluators.


Hello, I’m Andrea Beesley, managing director in the education division at IMPAQ International. Through reliable evidence, IMPAQ helps governments, businesses, foundations, nonprofits, and universities enhance their programs and policies. Our primary markets are health, workforce development, social programs, education, and international development.

Our experience evaluating out-of-school-time (OST) curricula has given us the opportunity to work with many dynamic organizations and dedicated staff who support youth learning and recreation after the school day or during the summer. We’ve recently been reflecting on what we’ve learned about recruiting and working with OST programs.

Lessons Learned:

  • Coming from a day school evaluation background, I tend to think of recruiting as something best done as early as possible. However, we have found that some OST programs cannot commit to participating in a study several months ahead of time, because they anticipate leadership, staff, funding, and enrollment changes shortly before their annual startup date that could affect their ability to participate. Recruiting shortly before implementation worked best.
  • When scheduling staff professional development, we found that summer workshops before fall program startup were often not feasible because staff had not yet been hired. It was better to schedule them after school had started. Many afterschool program staff had other jobs during the day, so Saturday sessions were the most practical.
  • Youth and staff turn over frequently in OST programs, and youth attendance is often intermittent. Therefore it is important to use a design that does not depend on a large percentage of group members being present for both pre-program and post-program measures, or on youth attending every session of a curriculum implementation.
  • State afterschool networks can provide invaluable assistance in recommending, recruiting, and communicating with OST programs. It is helpful to partner with them, consult with them during the design and implementation of an evaluation, and provide sufficient funding in the evaluation budget for their work.

Rad Resource:

The 50 State Afterschool Network can link you to the network in each state.

The American Evaluation Association is celebrating Ed Eval TIG Week with our colleagues in the PreK-12 Educational Evaluation Topical Interest Group. The contributions all this week to aea365 come from our Ed Eval TIG members. Do you have questions, concerns, kudos, or content to extend this aea365 contribution? Please add them in the comments section for this post on the aea365 webpage so that we may enrich our community of practice. Would you like to submit an aea365 Tip? Please send a note of interest to aea365@eval.org. aea365 is sponsored by the American Evaluation Association and provides a Tip-a-Day by and for evaluators.


Hello AEA365! We are the Societal Advancement Insights & Impact team at the Center for Creative Leadership (CCL) in Greensboro, NC (Valerie Ehrlich, Tim Leisman, Micela Leis, and Jeff Kosovich).

Professional networks among teachers play an important role in K-12 settings, where certain network characteristics have been linked to improved teacher practices and student outcomes. Building on Coburn and Russell’s 2008 work in Educational Evaluation and Policy Analysis, we use Social Network Analysis (SNA) to collect formative and summative evaluation data about program interventions that help teachers and administrators develop as leaders and improve performance.

Hot Tips:

  1. Consider the rationale for using SNA: Does the program focus on creating relationships?

CCL co-designed and co-facilitated a year-long professional development program for teachers at Ravenscroft, an independent K-12 school in Raleigh, NC. The goals of the program were to improve teachers’ competencies in facilitative teaching practice for student engagement and to strengthen connections across school levels and within school divisions through a purposeful cohort design. We administered a social network survey at both the whole school and individual cohort levels in order to assess the relationship-building aspect of the intervention.

  2. Consider a Pre-Post Design: How much will the network grow throughout the program?

Using a pre-post design is important for highlighting the program’s effect. Using this design in our work with Ravenscroft, we measured a notable increase in the teachers’ collaborative networks. At the end of the program, faculty each reported an average of 13 new collaborative connections with colleagues they hadn’t collaborated with before the program – a total of 1,635 new collaborative connections schoolwide!

[Figure: Single Cohort Collaboration Networks]
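
As a minimal sketch of how such a pre-post comparison can be computed, the R snippet below counts ties reported at post that were absent at pre. The edge lists and names are hypothetical, purely for illustration; they are not Ravenscroft data.

```r
# Minimal sketch: count new collaborative ties between two survey waves.
# Each row of an edge list is one reported tie (from -> to); data are made up.
pre_edges  <- data.frame(from = c("Ana", "Ben", "Cam"),
                         to   = c("Ben", "Cam", "Ana"))
post_edges <- data.frame(from = c("Ana", "Ben", "Cam", "Ana", "Dee"),
                         to   = c("Ben", "Cam", "Ana", "Dee", "Ben"))

# A tie is "new" if its from-to pair appears at post but not at pre
pre_keys  <- paste(pre_edges$from, pre_edges$to)
post_keys <- paste(post_edges$from, post_edges$to)
new_ties  <- post_edges[!(post_keys %in% pre_keys), ]

nrow(new_ties)        # total new collaborative connections (here: 2)
table(new_ties$from)  # new connections reported by each respondent
```

Averaging those per-person counts over all respondents yields the kind of “new connections per teacher” figure reported above.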

  3. Dig Deeper: What can you say about the types of network connections?

Ask about different ‘levels’ or ‘types’ of relationships: who do you share ideas with? Who do you collaborate with regularly? Who do you seek support from? This helps you speak not only to the quantity of relationships, but the nature and quality as well.
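
One simple way to capture several relationship types, sketched below with hypothetical data, is to keep one edge list per survey question and compare tie counts across those layers.

```r
# Hypothetical multiplex data: one edge list per relationship question
layers <- list(
  share_ideas = data.frame(from = c("Ana", "Ben", "Cam"),
                           to   = c("Ben", "Cam", "Ana")),
  collaborate = data.frame(from = c("Ana", "Ben"),
                           to   = c("Ben", "Ana")),
  support     = data.frame(from = "Cam", to = "Ana")
)
sapply(layers, nrow)  # ties per layer: idea-sharing is broad, support is rarer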

  4. Examine Reciprocity: Are the connections superficial, or are individuals reporting the same relationships?

Compare measures of perceived connections with reciprocal connections. This allows evaluators to get an idea of how people perceive their networks as well as go beyond self-reporting to explore the nature of those relationships in practice.
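
For a directed edge list, the igraph package can report how many perceived ties are actually reciprocated; the snippet below is a sketch with hypothetical data.

```r
library(igraph)

# Hypothetical directed ties: Ana and Ben name each other; Cam's tie is one-way
edges <- data.frame(from = c("Ana", "Ben", "Cam"),
                    to   = c("Ben", "Ana", "Ana"))
g <- graph_from_data_frame(edges, directed = TRUE)

reciprocity(g)           # proportion of ties that are mutual (here: 2/3)
sum(which_mutual(g)) / 2 # number of reciprocated pairs (here: 1)
```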

  5. Mind the Survey Length: SNA can provide amazing information, but it is a time investment.

SNA can be burdensome for large groups when the survey presents a large list of people. One way to ease that burden is to use survey display logic in the following way: First, ask participants to select the people they know from the list. Then, pipe those responses forward to ask about the nature/quality of those relationships. It helps make the survey less daunting!
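
The piping pattern is easy to prototype before committing to a survey platform. This hypothetical sketch mirrors the two-stage flow: respondents pick names from the roster first, and follow-up questions are generated only for the names they picked.

```r
# Hypothetical roster and one respondent's stage-1 selections
roster   <- c("Ana", "Ben", "Cam", "Dee")
selected <- c("Ben", "Dee")  # stage-1 answers, piped forward to stage 2

# Stage 2: ask about relationship quality only for the selected colleagues
followups <- sprintf("How often do you collaborate with %s?", selected)
print(followups)
```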

Rad Resources:

  • Do SNA in R! Contact our team at CCL for two tutorials about using R to conduct SNA.
  • Learn more about our work with Ravenscroft here and feel free to contact us.

The American Evaluation Association is celebrating Ed Eval TIG Week with our colleagues in the PreK-12 Educational Evaluation Topical Interest Group. The contributions all this week to aea365 come from our Ed Eval TIG members. Do you have questions, concerns, kudos, or content to extend this aea365 contribution? Please add them in the comments section for this post on the aea365 webpage so that we may enrich our community of practice. Would you like to submit an aea365 Tip? Please send a note of interest to aea365@eval.org. aea365 is sponsored by the American Evaluation Association and provides a Tip-a-Day by and for evaluators.


Hello! I am Kelly Murphy, chair of the PreK–12 Educational Evaluation Topical Interest Group (TIG) and a senior research scientist at Child Trends. Welcome to our TIG’s sponsored AEA365 week! This week we are sharing lessons learned, evaluation tips, and resources for evaluators who work in PreK–12 educational evaluation.

When people think about the educational evaluation field, often the first thing that comes to mind is student achievement. And while many educational evaluators do focus on this critical area, educational evaluators also engage in evaluations related to such diverse topics as health and mental health, school climate and student safety, integrated student supports, and professional development and training of educators.

Across such a range of evaluations, PreK–12 evaluators use a wide array of quantitative and qualitative evaluation methods, engage a diverse group of stakeholders in collaborative and participatory evaluations, and build and test evaluation theory. Our evaluations involve diverse populations, including youth of color, children with disabilities, youth involved in the child welfare and juvenile justice systems, and children living in socioeconomically disadvantaged neighborhoods.

Given the diversity of our field, we know that our work as educational evaluators likely intersects with the work that you do. Throughout this week, we will share insights and tips from our work, but would love to hear and learn more about the work of other AEA members. We’d also love to learn about the evaluation contexts in which you practice, the stakeholder groups you engage, and your various interest areas.

Rad Resources:

Reach out to our TIG’s Leadership Team on social media or at PreK12.Ed.Eval.TIG@gmail.com!

  • TIG Website: http://comm.eval.org/prk12/home
  • Facebook: We have migrated conversations from our old community page to our GROUP page: https://www.facebook.com/groups/907201272663363/. Please come join our group, as we use Facebook to supplement our website and to communicate with each other, share ideas and resources, and just get to know friends, colleagues, and newcomers who have similar interests. Anyone who visits the page is welcome to post and share other links and resources with the group.
  • LinkedIn: Search for us on LinkedIn as PreK-12 Educational Evaluation TIG. This is a members-only group, so please send a request to join in order to see the content.
  • Twitter: We are tweeting with the user name PreK-12 Ed. Eval. Follow @PK12EvalTIG at https://twitter.com/PK12EvalTIG.

The American Evaluation Association is celebrating Ed Eval TIG Week with our colleagues in the PreK-12 Educational Evaluation Topical Interest Group. The contributions all this week to aea365 come from our Ed Eval TIG members. Do you have questions, concerns, kudos, or content to extend this aea365 contribution? Please add them in the comments section for this post on the aea365 webpage so that we may enrich our community of practice. Would you like to submit an aea365 Tip? Please send a note of interest to aea365@eval.org. aea365 is sponsored by the American Evaluation Association and provides a Tip-a-Day by and for evaluators.

