AEA365 | A Tip-a-Day by and for Evaluators

CAT | Prek-12 Educational Evaluation

Hi, I’m Chad Green, a program analyst at Loudoun County Public Schools in northern Virginia. Over the past year I’ve been seeking out developmental evaluation (DE) practitioners in school districts throughout the U.S. and abroad. Recently I had the pleasure of interviewing Keiko Kuji-Shikatani (C.E.), an educator and internal evaluator with the Ontario Ministry of Education who also helped launch the Credentialed Evaluator designation process for the Canadian Evaluation Society (CES).

Credentialed Evaluators (currently 394 in total) are committed to continuous professional learning, which, as Keiko explained, is also the focus of DE. More specifically, DE “supports innovation development to guide adaptation to emergent and dynamic realities in complex environments” (Patton, 2010). Keiko believes that DE is well suited to public sector work, in which adaptation and innovation are the norm in providing services given the changing realities of society.

Hot Tips:

  • The best way to introduce DE, whether to program/policy staff or senior leadership, is to remember that DE is about learning and that, when properly applied, evaluation capacity building is happening 24/7.
  • DE involves learning as you go, which requires evaluators to engage in systems thinking so they can zoom in and out as they work and continue to co-create innovative solutions to complex challenges.
  • DE is not evaluation light. Developmental evaluators must have a thorough knowledge of evaluation so they can facilitate user-centric use of the learning gained from the DE approach (i.e., a focus on utilization) in real time to tackle complex issues.

Keiko prefers to use conventional evaluation tools like logic models to co-construct a theory of change with the team of stakeholders, resulting in a shared understanding of the evolving evaluand. What is unique here is that she insists on describing their ideas in full sentences rather than short phrases, much like the clear language used in the AEA Evaluator Competencies, to avoid the misunderstandings that arise easily when complexity is the norm in a system as large as hers.

Once the team members feel like the desired changes are plausible, she helps them to co-construct the theory of action so that they can collaboratively embed evaluative thinking in the way they work and make the changes feasible. She then takes the team further into what the year looks like to identify (a) the forks in the road where evaluation rigor is fundamental and (b) the use of appropriate data collection methods, analysis, and user-centric use of data so DE or “learning as we go” becomes the way the team makes sense of changing circumstances.

Rad Resources:

The American Evaluation Association is celebrating Ed Eval TIG Week with our colleagues in the PreK-12 Educational Evaluation Topical Interest Group. The contributions all this week to aea365 come from our Ed Eval TIG members. Do you have questions, concerns, kudos, or content to extend this aea365 contribution? Please add them in the comments section for this post on the aea365 webpage so that we may enrich our community of practice. Would you like to submit an aea365 Tip? Please send a note of interest to aea365@eval.org. aea365 is sponsored by the American Evaluation Association and provides a Tip-a-Day by and for evaluators.

My name is Garima Bansal. I am a faculty member at the University of Delhi, India, where I teach a course on Assessment Education to teacher candidates enrolled in the Bachelor of Education (B.Ed.) program. Reading DeLuca, Chavez, Bellara, and Cao’s 2013 paper in The Teacher Educator on pedagogies that inform assessment education helped me better understand the teaching and learning experiences in my own class. In particular, their four pedagogical activities – (1) perspective-building conversations, (2) praxis: connecting theory and practice, (3) modelling: practice what you preach, and (4) critical reflection and planning for professional learning – reflected the teaching that occurred in my class. Drawing on these four pedagogical activities, in this post I describe how I worked to develop the assessment literacy of teacher candidates enrolled in the B.Ed. program.

Lessons Learned:

  • Multiple perspective conversations: Teacher candidates were given prompts to reflect upon, such as newspaper items and media reports on the side effects of formal testing on various stakeholders (for example, “India’s examination system is only focussed on exam. Knowledge is not a priority,” from the magazine Outlook India).

They shared their own experiences to build a diverse yet coherent vision of various school-related evaluation issues. They spoke about how shifts in evaluation policy came as a surprise to them in 2009, when Continuous and Comprehensive Evaluation replaced the formal Class X examination, among other changes.

  • Collaborative assessment projects: Collaborative projects among school teachers, university faculty, and teacher candidates were initiated on different themes relating to student assessment. These projects enabled candidates to prepare differentiated assessment tasks for learners of varied abilities studying in the same class, and to collect samples of formative assessment tasks and critically analyse them for the nature of learning they reveal. Undertaking and analysing these projects gave candidates both a global and a local perspective on different evaluation issues.
  • Modelling: Formative assessment components were embedded across the different courses of the teacher professional development program, the two-year Bachelor of Education (B.Ed.). The program included a specific paper on teacher assessment education – Assessment for Learning – taught to teacher candidates in the second year. Though candidates were taught specific tools and techniques for creating and conducting formative assessments in this course, they were also continually given practice in taking formative assessments themselves through the program’s other courses.
  • Addressing assessment dilemmas: During their internships in schools, teacher candidates explored various assessment dilemmas faced by in-service teachers – formative versus summative, stressing competition versus cooperation, and so on – and proposed possible solutions.
  • Educational evaluation policy analysis: Teacher candidates reflected upon shifts in educational evaluation policy across the world. Working in groups of four, they developed projects explicating the causes of these shifts and the strengths and weaknesses of existing policies.
  • Reflections: Candidates were constantly encouraged to reflect on the strengths and weaknesses of the assessment plans they made for school students. I simultaneously sought their feedback on the assessment pedagogy I used in this program. (Rad Resource: the TCrunch app, for instantaneous feedback and for managing assignments, is freely downloadable from the Apple App Store and Google Play.)


Hello! We are Dana Linnell Wanzer, evaluation doctoral student, and Tiffany Berry, research associate professor, from the Youth Development Evaluation Lab at Claremont Graduate University. Today we are going to discuss the importance of high quality relationships with practitioners in evaluations.

“In the absence of strong relationships and trust, partnerships usually fail.”

-Henrick, Cobb, Penuel, Jackson, & Clark, 2017, p. 5

Research on factors that promote evaluation use often includes stakeholder involvement as a key component (Alkin & King, 2017; Johnson et al., 2009). However, collaboration with practitioners is insufficient to promote use; partners must also develop and maintain high quality relationships. For example, district leaders stress the importance of building productive relationships for promoting the use of evaluations in their districts (e.g., Harrison et al., 2017; Honig et al., 2017).

The importance of high quality relationships has been stressed through the focus on participatory or collaborative approaches to evaluation and through the inclusion of interpersonal factors in the evaluator competencies. Furthermore, utilization-focused evaluation (Patton, 2008) states that “evaluators need skills in building relationships, facilitating groups, managing conflict, walking political tightropes, and effective interpersonal communications” (p. 83) to promote use.

Lesson Learned: In our experience as evaluators, the programs that have made the greatest strides in using evidence to inform decision-making are those that have a strong, caring relationship with the evaluation team. We genuinely want to see each other succeed; we are friendly and enjoy being together. We do not approach the relationship as a series of tasks to perform; rather, the relationship affords us the opportunity to talk honestly about the strengths, weaknesses, and gaps in programming that should be addressed. Without authentically enjoying each other’s company, meeting becomes a chore, and the informal opportunities to chat about using evidence to improve programs are reduced.

Hot Tip: High quality relationships are characterized by factors such as:

  • Trust
  • Respect
  • Dependability
  • Warmth
  • Psychological safety
  • Long-term commitment to mutual goals
  • Liking one another and feeling close to each other

Rad Resource: King and Stevahn (2013) describe interactive evaluation practice as “the intentional act of engaging people in making decisions, taking action, and reflecting while conducting an evaluation study” (p. 14). They describe six principles for interactive evaluation practice: (1) get personal, (2) structure interaction, (3) examine context, (4) consider politics, (5) expect conflict, and (6) respect culture. They also provide 13 interactive strategies that can be used to promote positive interdependence among partners.

Rad Resource: Are you interested in assessing the effectiveness of your collaboration, especially its relationship quality? Check out the Collaboration Assessment Tool, especially the membership subscale!


Hello, I’m Andrea Beesley, managing director in the education division at IMPAQ International. Through reliable evidence, IMPAQ helps governments, businesses, foundations, nonprofits, and universities enhance their programs and policies. Our primary markets are health, workforce development, social programs, education, and international development.

Our experience evaluating out-of-school-time (OST) curricula has given us the opportunity to work with many dynamic organizations and dedicated staff who support youth learning and recreation after the school day or during the summer. We’ve recently been reflecting on what we’ve learned about recruiting and working with OST programs.

Lessons Learned:

  • Coming from a day-school evaluation background, I tend to think of recruiting as something best done as early as possible. However, we have found that some OST programs cannot commit to participating in a study several months ahead of time, because they anticipate leadership, staff, funding, and enrollment changes shortly before their annual start-up date that could affect their ability to participate. Recruiting shortly before implementation worked best.
  • When scheduling staff professional development, we found that summer workshops before fall program startup were often not feasible because staff had not yet been hired. It was better to schedule them after school had started. Many afterschool program staff had other jobs during the day, so Saturday sessions were the most practical.
  • Youth and staff turn over frequently in OST programs, and youth attendance is often intermittent. Therefore it is important to use a design that does not depend on a large percentage of group members being present for both pre-program and post-program measures, or on youth attending every session of a curriculum implementation.
  • State afterschool networks can provide invaluable assistance in recommending, recruiting, and communicating with OST programs. It is helpful to partner with them, consult with them during the design and implementation of an evaluation, and provide sufficient funding in the evaluation budget for their work.

Rad Resource:

The 50 State Afterschool Network can link you to the network in each state.


Hello AEA 365! We are the Societal Advancement Insights & Impact team (Valerie Ehrlich, Tim Leisman, Micela Leis, and Jeff Kosovich) at the Center for Creative Leadership in Greensboro, NC.

Professional networks among teachers play an important role in K-12 settings, where certain network characteristics have been linked to improved teacher practices and student outcomes. Building on Coburn and Russell’s 2008 work in Educational Evaluation and Policy Analysis, we use Social Network Analysis (SNA) to collect formative and summative evaluation data about program interventions that help teachers and administrators develop as leaders and improve performance.

Hot Tips:

  1. Consider the rationale for using SNA: Does the program focus on creating relationships?

CCL co-designed and co-facilitated a year-long professional development program for teachers at Ravenscroft, an independent K-12 school in Raleigh, NC. The goals of the program were to improve teachers’ competencies in facilitative teaching practice for student engagement and to strengthen connections across school levels and within school divisions through a purposeful cohort design. We administered a social network survey at both the whole school and individual cohort levels in order to assess the relationship-building aspect of the intervention.

  2. Consider a Pre-Post Design: How much will the network grow throughout the program?

Using a pre-post design is important for highlighting the program’s effect. By using one in our work with Ravenscroft, we measured a notable increase in the teachers’ collaborative networks. On average, each faculty member reported 13 new collaborative connections at the end of the program with colleagues they had not collaborated with before – a total of 1,635 new collaborative connections schoolwide!

Single Cohort Collaboration Networks diagram
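The arithmetic behind such a pre-post count can be sketched in a few lines: gather each reported collaboration tie at pre-test and at post-test, then count the post-test ties that were absent at pre-test. A minimal sketch in Python (the names and edge lists are hypothetical, not Ravenscroft data):

```python
# Sketch: counting new collaborative ties in a pre-post SNA design.
# Each tie is (reporter, named_colleague); both edge lists are hypothetical.
pre = {("ana", "ben"), ("ana", "cal"), ("ben", "cal")}
post = {("ana", "ben"), ("ana", "dee"), ("ben", "dee"), ("cal", "dee")}

new_ties = post - pre  # ties reported at post-test that were absent at pre-test

per_person_new = {}
for reporter, _ in sorted(new_ties):
    per_person_new[reporter] = per_person_new.get(reporter, 0) + 1

print(len(new_ties))   # total new ties networkwide
print(per_person_new)  # new ties per reporter
```

Summing the per-person counts over all faculty gives the schoolwide total reported above.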

  3. Dig Deeper: What can you say about the types of network connections?

Ask about different ‘levels’ or ‘types’ of relationships: Who do you share ideas with? Who do you collaborate with regularly? Who do you seek support from? This helps you speak not only to the quantity of relationships but to their nature and quality as well.

  4. Examine Reciprocity: Are the connections superficial, or are individuals reporting the same relationships?

Compare measures of perceived connections with reciprocal connections. This allows evaluators to get an idea of how people perceive their networks as well as go beyond self-reporting to explore the nature of those relationships in practice.
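One way to operationalize that comparison: treat each survey response as a directed tie, count every reported (“perceived”) tie, then keep only the pairs in which both parties named each other. A minimal sketch in Python, with hypothetical data:

```python
# Sketch: perceived vs. reciprocated ties in a directed network.
# directed[i] is the set of colleagues person i named; data are hypothetical.
directed = {
    "ana": {"ben", "cal"},
    "ben": {"ana"},
    "cal": set(),
}

perceived = sum(len(named) for named in directed.values())  # every reported tie
reciprocated = {
    frozenset((a, b))
    for a, named in directed.items()
    for b in named
    if a in directed.get(b, set())  # keep the pair only if b named a back
}
print(perceived)          # all ties anyone reported
print(len(reciprocated))  # mutual ties confirmed by both parties
```

The gap between the two numbers is itself informative: a network with many perceived but few reciprocated ties may be thinner than self-reports suggest.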

  5. Mind the Survey Length: SNA can provide amazing information, but it is a time investment.

SNA can be burdensome for large groups when the survey presents a large list of people. One way to ease that burden is to use survey display logic in the following way: First, ask participants to select the people they know from the list. Then, pipe those responses forward to ask about the nature/quality of those relationships. It helps make the survey less daunting!
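The two-stage flow described above is essentially a filter-then-pipe step. Survey-platform specifics vary; the roster and stage-1 selections below are hypothetical:

```python
# Sketch: two-stage survey display logic for a long roster.
# Stage 1: the respondent selects the colleagues they know from the full list.
# Stage 2: only those selections are piped forward into follow-up questions.
roster = ["ana", "ben", "cal", "dee", "eli"]
stage1_picks = {"ben", "dee"}  # hypothetical stage-1 selections

followups = [
    f"How often do you collaborate with {name}?"
    for name in roster
    if name in stage1_picks
]
print(followups)  # two follow-up questions instead of five
```

In practice the piping is configured in the survey tool’s display logic rather than in code, but the effect is the same: respondents only answer relationship-quality items about people they actually know.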

Rad Resources:

  • Do SNA in R! Contact our team at CCL for two tutorials about using R to conduct SNA.
  • Learn more about our work with Ravenscroft here and feel free to contact us.


Hello! I am Kelly Murphy, chair of the  PreK–12 Educational Evaluation Topical Interest Group (TIG)  and a senior research scientist at Child Trends. Welcome to our TIG’s sponsored AEA365 week! This week we are sharing lessons learned, evaluation tips, and resources for evaluators who work in PreK–12 educational evaluation.

When people think about the educational evaluation field, often the first thing that comes to mind is student achievement. And while many educational evaluators do focus on this critical area, educational evaluators also engage in evaluations related to such diverse topics as health and mental health, school climate and student safety, integrated student supports, and professional development and training of educators.

Across such a range of evaluations, PreK–12 evaluators use a wide array of quantitative and qualitative evaluation methods, engage a diverse group of stakeholders in collaborative and participatory evaluations, and build and test evaluation theory. Our evaluations involve diverse populations, including youth of color, children with disabilities, youth involved in the child welfare and juvenile justice systems, and children living in socioeconomically disadvantaged neighborhoods.

Given the diversity of our field, we know that our work as educational evaluators likely intersects with the work that you do. Throughout this week, we will share insights and tips from our work, but would love to hear and learn more about the work of other AEA members. We’d also love to learn about the evaluation contexts in which you practice, the stakeholder groups you engage, and your various interest areas.

Rad Resources:

Reach out to our TIG’s Leadership Team on social media or at PreK12.Ed.Eval.TIG@gmail.com!

  • TIG Website: http://comm.eval.org/prk12/home
  • Facebook: We have migrated conversations from our old community page to our GROUP page: https://www.facebook.com/groups/907201272663363/. Please come join our group, as we use Facebook to supplement our website and to communicate with each other, share ideas and resources, and just get to know friends, colleagues, and newcomers who have similar interests. Anyone who visits the page is welcome to post and share other links and resources with the group.
  • LinkedIn: Search for us on LinkedIn as PreK-12 Educational Evaluation TIG. This is a members-only group, so please send a request to join in order to see the content.
  • Twitter: We are tweeting with the user name PreK-12 Ed. Eval. Follow @PK12EvalTIG at https://twitter.com/PK12EvalTIG.


I am Steve Kimball, a researcher and evaluator with the Wisconsin Evaluation Collaborative at the Wisconsin Center for Education Research, UW-Madison. We have recently embraced a Networked Improvement Community (NIC) approach to learn from and with schools using varying approaches to personalized learning. I have been intrigued by the implications of the NIC concept as a form of participatory and utilization-focused evaluation.

The Personalization in Practice Networked Improvement Community (PiPNIC) focuses on developing or refining student conferring protocols with five schools, to help teachers and students engage in productive learning conversations. Each school team includes 4-5 teachers and school leaders. The teams meet with our research group over four Saturday sessions during a 90-day cycle. Between sessions, the teams have reflected on current student conferring practices and have developed and refined conferring protocols. They are now testing their protocols using scripts, taking notes on brief reflection forms, and using videos to capture the student-teacher discussions.

A UW-Madison research team led by Professor Richard Halverson facilitates the NIC. The work is part of a larger partnership with the Wisconsin Department of Public Instruction, funded by the U.S. Department of Education Institute of Education Sciences, to develop resources supporting Wisconsin’s state longitudinal data system.

Lessons Learned:

  • On the research side, it is time intensive to recruit, orient, and support participants in the NIC process. Extensive preparation preceded the actual work. The evaluation team recruits and convenes participants, facilitates problem discovery and networking meetings, and helps participants with data collection and analysis. Practitioners must likewise commit time and personnel resources to participate in the NIC and to develop and test the protocols.
  • Learning by doing involves risk and unpredictable results. For this first project, it was important to recruit practitioners already engaged in cutting-edge personalized learning practices. These educators were willing to take on new challenges because the problems were anchored in their practice and addressed an immediate need.
  • Benefits of this approach include collaboration with peers within and across schools and the potential for deep ownership of the process and results. Teachers and leaders can immediately see the results and put them to use for improvement. Participants have said the work represents high quality professional learning. Collaboration within and across schools created a mutually supportive venture into the unknown.

The NIC model provides a great structure for participatory evaluation. We are eager to explore the approach with others, and engage in the next 90-day cycle with participating schools and districts.

Rad Resources:

Explore the virtual honeycomb from Cooperative Educational Service Agency 1 for a summary of personalized education.

Carnegie 90-Day Cycle Handbook

The main text that the hub facilitators shared with their school teams is from:

Getting Ideas into Action – the Network Improvement Community Model for Professional Learning.

Also see, Carnegie Foundation’s Learning to Improve

The American Evaluation Association is celebrating The Wisconsin Idea in Action Week coordinated by the LEAD Center. The LEAD (Learning through Evaluation, Adaptation, and Dissemination) Center is housed within the Wisconsin Center for Education Research (WCER) at the School of Education, University of Wisconsin-Madison, and advances the quality of teaching and learning by evaluating the effectiveness and impact of educational innovations, policies, and practices within higher education. The contributions all this week to aea365 come from student and adult evaluators living in and practicing evaluation from the state of WI. Do you have questions, concerns, kudos, or content to extend this aea365 contribution? Please add them in the comments section for this post on the aea365 webpage so that we may enrich our community of practice. Would you like to submit an aea365 Tip? Please send a note of interest to aea365@eval.org. aea365 is sponsored by the American Evaluation Association and provides a Tip-a-Day by and for evaluators.


Hello, everyone! I’m Leigh M. Tolley, Visiting Assistant Professor, Secondary Education at the University of Louisiana at Lafayette (UL Lafayette) and Chair of the PreK-12 Educational Evaluation TIG. I have been extremely fortunate to become involved with the Vermilionville Education Enrichment Partnership, or VEEP, an academic service learning collaboration between Vermilionville, a living history museum and folklife park, UL Lafayette, and the Lafayette Parish School System (LPSS). Through VEEP, and under the mentorship of UL Lafayette faculty, pre-service elementary and secondary social studies and English/language arts teachers prepare and implement interdisciplinary lessons with LPSS students that are rooted in Acadian, Native American, and Creole cultures.

My colleagues at UL Lafayette, Drs. Toby Daspit, Natalie Keefer, and Micah Bruce-Davis, and our friends at Vermilionville, Ms. Melanie Harrington, Education Coordinator, and Mr. Brady McKellar, Director of Museum Operations, have helped me to think more about how educational experiences outside of a school setting can impact PreK-12 students and their teachers, as well as college-level students preparing for teaching careers.

Hot Tip: Obtain input from as many people as possible!

Data about each “VEEP Day” experience have been obtained from surveys administered to participating students, their teachers, and the Vermilionville guides who accompanied them, as well as to UL Lafayette students. The UL Lafayette faculty review our students’ lessons, conduct informal observations during their implementation, and follow up with our pre-service teachers about their experiences. The VEEP team uses all of this information to get a well-rounded picture of the day and applies the findings toward continuous improvement and meaningful curricular and program enrichment.

Hot Tip: Variation is key.

For over five years, VEEP has provided both exciting opportunities for area students and multiple ways that we can learn about educational program evaluation in a living history museum context. Various evaluation approaches and data collection methods are critical to formative evaluations of UL Lafayette students’ interdisciplinary lessons, summative evaluations of each VEEP Day, and developmental evaluation of the program.

Lesson Learned: Effective collaboration can widely impact learning.

The VEEP program has helped Vermilionville to build stronger educational ties with the community, including area pre-service and in-service teachers, elementary and secondary students, school board administrators, and university faculty, while also adding to its educational resources for future visitors with the lesson plans and instructional materials that are created for this partnership.

Rad Resource:

Lesson plans created through VEEP are shared on the Educate section of Vermilionville’s website. These include the anchor activities that are conducted at the village, as well as pre- and post-lessons for elementary and secondary teachers to use in the classroom.


Hi! Our names are Carrie Wiley and Matt Reeder, and we are Senior Research Scientists at the Human Resources Research Organization (HumRRO). We would like to share an abbreviated version of our demonstration session, presented at the 2016 annual meeting in Atlanta, on how to create maps of your data in R. It sounds like a daunting task, but it is far easier than it seems.

In addition to the many tools and resources that exist to help evaluators create more effective tables and graphs, geographic mapping can be a great way to identify and demonstrate geographic patterns. Geographic Information System (GIS) mapping may be perceived by many as a rather intimidating technique, since most evaluators are not formally trained in GIS. In our work, we often deal with naturally occurring large-scale data (e.g., state-level data, school districts, counties, ZIP codes) that can be displayed in more effective ways than a traditional table. Drawing maps really just requires coordinates, and for very basic maps, R provides those coordinates in a nicely formatted file.

Hot Tips:

All you need to get started is R and a few add-on packages.

GIS Basics:

In order to map data, you need to draw boundaries. Those boundary data live in shapefiles (.shp), which contain the latitude and longitude coordinates of the boundaries you want to draw. The Census Bureau’s TIGER (Topologically Integrated Geographic Encoding and Referencing) files make various cartographic boundary shapefiles available for download, or you can use built-in R packages that essentially pull the data for you.

Mapping the Data:

Our example plots a heatmap of the number of craft breweries in each state.

  1. Retrieve the publicly available craft brewery directory: https://www.brewersassociation.org/directories/breweries/

2. Install and load the following R packages:

 a. library(dplyr)

 b. library(ggplot2)

 c. library(mapproj)

3. Data excerpt:

4. Load the boundary data via map_data() from the maps package (a ggplot2 dependency):

 a. states <- map_data("state")

 b. Data excerpt:

5. Get counts of breweries by state and merge them with the coordinates file. (The lines below are a sketch: they assume the brewery directory has been read into a data frame named breweries with a lowercase state-name column called region, matching the region column in states.)

 a. brewery_counts <- breweries %>% group_by(region) %>% summarise(n_breweries = n())

 b. plot_data <- left_join(states, brewery_counts, by = "region")

6. Plot the heatmap:

 a. ggplot(plot_data, aes(x = long, y = lat, group = group, fill = n_breweries)) + geom_polygon(color = "white") + coord_map()

So, based on this map, if you are an avid fan of craft beer, California, Washington, and Colorado are good places to check out. Of course, these are raw counts—creating a heatmap that accounts for population density would be more useful. If you are a coffee drinker, find a publicly available coffee shop database and practice your new skills plotting a heatmap of coffee shops! 

Rad Resources:

Using different combinations of R packages and Census data, you can make heatmaps by county or school district, and bubble charts by ZIP code.

Useful Census data:

Useful R packages:

  • library(zipcode)
  • library(maps)
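As a sketch of the county-level variant: map_data() also ships county outlines, so the same geom_polygon() recipe extends directly. Join your own counts onto the region (state) and subregion (county) columns to fill the map.

```r
library(ggplot2)
library(mapproj)

# Built-in county outlines: 'region' is the state, 'subregion' the county
counties <- map_data("county")

# Skeleton map; merge your own counts by region/subregion to color it in
ggplot(counties, aes(x = long, y = lat, group = group)) +
  geom_polygon(fill = "grey90", color = "white") +
  coord_map()
```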

The American Evaluation Association is celebrating Ed Eval TIG Week with our colleagues in the PreK-12 Educational Evaluation Topical Interest Group. The contributions all this week to aea365 come from our Ed Eval TIG members. Do you have questions, concerns, kudos, or content to extend this aea365 contribution? Please add them in the comments section for this post on the aea365 webpage so that we may enrich our community of practice. Would you like to submit an aea365 Tip? Please send a note of interest to aea365@eval.org. aea365 is sponsored by the American Evaluation Association and provides a Tip-a-Day by and for evaluators.


Greetings! We are Kate LaVelle, Research Associate, and Judith Rhodes, Associate Professor of Research, from the Office of Social Service Research and Development (OSSRD) at Louisiana State University. At OSSRD we write large federal grants to support educational, place-based initiatives for school districts and communities with significant need in southern Louisiana. In this post, we share our lessons learned and tips based on our grant writing experiences.

Hot Tip: Grant applications require a description of the need being addressed; however, applications vary in how much direction they give for presenting information on needs. For example, some applications ask for results from a completed needs assessment or segmentation analysis. Other applications require you to discuss needs within preset categories, such as academic, health, or community needs. To cover these common requirements, we find it helpful to create a Gaps and Solutions table. This concisely presents evidence-based specific gaps that are linked to particular solutions, providing a clear justification for proposed services based on identified needs.

Here is an excerpt from a sample Gaps and Solutions table:

Hot Tip: When writing grant applications that incorporate complex approaches, we find it useful to develop an Intervention Design table that includes the detailed information that funding agencies typically want to know. For example, the table below contains information about who and how many individuals will be served, the cost of services per participant, plans for scaling up services over time, and the funding sources for each planned strategy. We include a list of key partners to show the important collaborations, as well as research-based evidence backing the proposed strategies. This table can also be helpful for communicating the intervention design to colleagues working on other parts of the grant, such as the budget or evaluation sections.

Lessons Learned:

  • Be purposeful in where you place tables in the grant application. For example, we have found that a Gaps and Solutions table works well at the end of the Needs section as a way to summarize key gaps and solutions, as well as provide a transition into the Program Design section, which typically follows. However, a more detailed Intervention Design table might be best placed in the Appendix if page space is limited, assuming that the table is sufficiently referred to in the narrative.
  • If feasible, hire a graphic designer (or graphic design student if cost is an issue) to create a logo specifically for your proposed initiative. We find having a professional logo adds a polished look to the application, as well as provides a visual branding that potential funders may be more likely to remember.

Rad Resource: Grants.gov is a helpful resource for exploring different types of education grants. Federal departmental websites also make previously awarded proposals available to view, which can spark ideas for presenting your next grant proposal effectively. After all, if a strategy was successful for another applicant, it might work for you!

