I am Laurene Johnson from Metiri Group, a research, evaluation, and professional development firm focusing on educational innovations and digital learning. I often work with school district staff to provide guidance and research/evaluation contributions to grant proposals, including those for submission to the National Science Foundation (NSF).
Programs like Discovery Research PreK-12 (DRK-12) present some interesting challenges for researchers and evaluators. Since I work at an independent research and evaluation firm, I don’t implement programs; I study them. This means that in order to pursue such funding, and research things I think are cool, I need to partner with school or district staff who do implement programs. They likely implement them quite well, and may even have experience obtaining grant funding to support them. This is both a real advantage in writing an NSF proposal and a real challenge. A successful research partnership (and proposal) will involve helping the practitioners understand where their program fits into the entire proposed project. It will likely be difficult for these partners to understand that NSF is funding the research, and is funding their program or innovation only because I’m going to research it. This can be a huge shift for people who have previously received funding to implement programs. Depending on the origin of the program, the individual I’m partnering with might also have a real attachment to it, which can make it even more difficult to explain that it’s going to “play second fiddle” to the research in a proposal.
This is not an easy conversation to have, but if researchers are successful, we can open up many more doors in terms of partnership opportunities in schools.
Hot Tip: Be prepared to have the research-versus-implementation conversation multiple times. In particular, someone who has written many successful proposals will tend to revert to what s/he knows and is comfortable with as the writing progresses.
Lesson Learned: Even if prior evaluations have indicated the program might be effective, the client must clearly explain the research base behind its design and components. My experience is that many programs in schools are designed around staff experience of what works, rather than having a foundation in what research says works (emphasizing instruction as an art rather than a science). This may be fine for implementing the program, but it falls short of funders’ expectations when it comes to designing an innovation in a research context.
Hot Tip: Try to get detailed information about the program in very early conversations, so you can write the research description as completely as possible. Deliver this to the client as, essentially, a proposal template, with the components they need to fill in clearly marked.
The American Evaluation Association is celebrating Research vs Evaluation week. The contributions all this week to aea365 come from members whose work requires them to reconcile distinctions between research and evaluation, situated in the context of STEM teaching and learning innovations. Do you have questions, concerns, kudos, or content to extend this aea365 contribution? Please add them in the comments section for this post on the aea365 webpage so that we may enrich our community of practice. Would you like to submit an aea365 Tip? Please send a note of interest to email@example.com. aea365 is sponsored by the American Evaluation Association and provides a Tip-a-Day by and for evaluators.