
Memorial Day Week: Community-Based Participatory Research in the Military: Lessons Learned by Stephen Axelrad

Hello, my name is Stephen Axelrad, founder and chair of the Military and Veteran Evaluation Topical Interest Group (TIG). I am posting this content on community-based participatory research in the military on behalf of a colleague.

U.S. Service Members are frequently asked to take surveys from local installations, the Department of Defense, and academic partners. Survey fatigue has become so widespread among U.S. Service Members that the topic has been studied by numerous academic partners and government agencies.

While participation rates are low, there are techniques evaluators can use to build greater buy-in, and in turn greater uptake, for evaluations within military populations.

Hot Tips:

1. Key Leader Interviews: Installation leadership often has a good idea of what issues are present at their installation before any evaluation begins. Interview key leaders in one-on-one sensing sessions to determine where the perceived gaps are for a given program. These interviews also generate buy-in by involving members of the target population. However, key leaders may not be aware of every issue, which leads to Tip 2.

2. Focus Groups with Service Members: If possible, hold focus groups before the evaluation to determine the domains that should be surveyed. Thematic analysis can then reveal domains that should be evaluated but did not surface in the key leader interviews (a minimal tallying sketch follows this list). These focus groups also give Service Members a forum to be candid about their experiences.

3. Testing with Service Members: If the intended audience is Service Members, the survey should be understandable to Service Members. Beta test your survey/evaluation with junior enlisted Service Members to identify unclear questions. This reduces the chance that questions are misinterpreted by the broader survey population.

4. Utilization of Military Operations Orders (OPORD): Response rates for online surveys typically hover around 20% and can be even lower for flagship military surveys; the Health Related Behaviors Survey drew only 9% in 2015 (see the back-of-the-envelope sketch after this list). Issuing an OPORD ensures the survey comes from a known party (the installation) rather than a less familiar outside entity, making Service Members more likely to respond.

5. Open-Ended Final Question: While focus groups are a rich source of qualitative data, the presence of others may introduce social desirability bias (i.e., not wanting to speak up for fear of being judged). An open-ended final question allows for rich data capture and honest responses, informing evaluators of issues that are present but were not necessarily captured by the survey's closed-ended items (a first-pass scan is sketched below).
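
To make Tip 2 concrete, here is a minimal Python sketch of tallying coded focus-group themes to surface domains that key leaders did not raise. The speakers, theme codes, and leader-identified domains are hypothetical illustrations, not data from any actual evaluation; in practice, coding is done by trained analysts, often in dedicated qualitative software.

```python
# A minimal sketch: tally hand-coded focus-group themes and flag
# domains that did not come up in key leader interviews.
# All codes and excerpts below are hypothetical.
from collections import Counter

# Each excerpt has already been hand-coded with one or more themes.
coded_excerpts = [
    {"speaker": "E-4", "themes": ["childcare access", "work-life balance"]},
    {"speaker": "E-5", "themes": ["housing quality"]},
    {"speaker": "E-3", "themes": ["childcare access"]},
    {"speaker": "E-6", "themes": ["housing quality", "work-life balance"]},
]

# Themes the key leaders raised in their interviews (hypothetical).
leader_identified = {"housing quality"}

# Count how often each theme appears across all excerpts.
theme_counts = Counter(
    theme for excerpt in coded_excerpts for theme in excerpt["themes"]
)

# Rank themes by frequency and flag those new to the focus groups.
for theme, count in theme_counts.most_common():
    flag = "" if theme in leader_identified else "  <- new domain to survey"
    print(f"{theme}: {count}{flag}")
```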
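
The response rates cited in Tip 4 also have direct sampling consequences. Here is a back-of-the-envelope sketch of how many invitations a given response rate implies; the 400-complete target is an arbitrary illustration.

```python
# A back-of-the-envelope sketch: invitations needed to reach a target
# number of completed surveys at an expected response rate.
import math

def invites_needed(target_completes: int, expected_rate: float) -> int:
    """Return the number of invitations implied by an expected response rate."""
    return math.ceil(target_completes / expected_rate)

# A typical 20% online rate vs. the 9% seen for the 2015 Health
# Related Behaviors Survey; the 400-complete target is illustrative.
for rate in (0.20, 0.09):
    print(f"At {rate:.0%}: invite {invites_needed(400, rate):,} for 400 completes")
```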
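
Finally, for Tip 5, a quick first-pass scan of open-ended responses can flag recurring issues before formal coding begins. The responses and stopword list below are hypothetical, and a keyword count like this is only a triage step, not a substitute for systematic thematic analysis.

```python
# A first-pass sketch: count recurring words in open-ended responses
# to flag issues the closed-ended items may have missed.
# Responses and stopwords are hypothetical.
import re
from collections import Counter

responses = [
    "The gym hours don't work for shift workers.",
    "Parking near the clinic is a constant problem.",
    "Gym hours conflict with duty schedules.",
    "",  # blank responses are common and are skipped automatically
]

STOPWORDS = {"the", "for", "with", "don", "near"}

word_counts = Counter()
for text in responses:
    words = re.findall(r"[a-z]+", text.lower())
    word_counts.update(w for w in words if w not in STOPWORDS and len(w) > 2)

# Frequent words point toward issues to examine in formal coding
# (here: "gym" and "hours" recur).
for word, count in word_counts.most_common(10):
    print(f"{word}: {count}")
```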

If you’re interested in learning more, I encourage you to read the remaining posts of this week, join our TIG via the AEA web site, and contact me at stephen.h.axelrad@gmail.com.

The American Evaluation Association is celebrating Memorial Day Week in Evaluation. The contributions this week are from members of the Military and Veteran Evaluation TIG, featuring contributions to evaluation with military origins but relevant to all we do. Do you have questions, concerns, kudos, or content to extend this aea365 contribution? Please add them in the comments section for this post on the aea365 webpage so that we may enrich our community of practice. Would you like to submit an aea365 Tip? Please send a note of interest to aea365@eval.org. aea365 is sponsored by the American Evaluation Association and provides a Tip-a-Day by and for evaluators.
