
Search Results for: pre test survey questions

No More Crappy Survey Reporting – Best Practices in Survey Reporting for Evaluations by Janelle Gowgiel, JoAnna Hillman, Mary Davis, and Christiana Reene

Janelle, JoAnna, Mary, and Christiana here, evaluators from Emory Centers for Public Health Training and Technical Assistance. We had the opportunity to present a session entitled No More Crappy Surveys at last year’s AEA Summer Evaluation Institute. We are on a mission to rid the world of crappy surveys, and are here to share some of our Hot Tips and Rad Resources to do so.

If you haven’t already, check out the first and second blog posts in this series: No More Crappy Surveys – Best Practices in Survey Design for Evaluations and No More Crappy Survey Analysis – Best Practices in Survey Analysis for Evaluations. Today, we’ll follow up with tips on how to report your survey findings to different audiences and how to engage partners throughout the survey process.

No More Crappy Survey Analysis – Best Practices in Survey Analysis for Evaluations by Janelle Gowgiel, JoAnna Hillman, Mary Davis, and Christiana Reene

Janelle, JoAnna, Mary, and Christiana here, evaluators from Emory Centers for Public Health Training and Technical Assistance. We had the opportunity to present a session entitled No More Crappy Surveys at last year’s AEA Summer Evaluation Institute. We are on a mission to rid the world of crappy surveys, and are here to share some of our Hot Tips and Rad Resources to do so.

If you haven’t already, check out the first blog post in this series, No More Crappy Surveys – Best Practices in Survey Design for Evaluations. Today, we’ll follow up with some tips on how to analyze your surveys (which, of course, you’ve made sure are not crappy!). Stay tuned for our final post of this series, on how to report your findings to different audiences.

A few Hot Tips for Designing Quality Survey Questions by Kim Firth Leonard and Sheila Robinson

Hello! We are Kim Firth Leonard, Leonard Research & Evaluation, LLC, and Oregon Community Foundation, and Sheila B. Robinson, Custom Professional Learning, LLC, co-authors of the text Designing Quality Survey Questions (Sage, 2018). We met on Twitter and have now spent 10 years on survey design together: writing about it on our blogs, working with clients to design surveys, teaching survey design workshops, and presenting at conferences, including AEA.

No More Crappy Surveys: Best Practices in Survey Design for Evaluations by Janelle Gowgiel, JoAnna Hillman, and Christiana Reene

Janelle, JoAnna, and Christiana here, evaluators from Emory Centers for Public Health Training and Technical Assistance. We had the opportunity to present a session entitled No More Crappy Surveys at the AEA Summer Evaluation Institute, teaching fellow evaluators how best to plan, design, execute, analyze, and report on surveys. We are on a mission to rid the world of crappy surveys, and are here to share some of our Hot Tips and Rad Resources to do so.

YFE TIG Week: Accessing Youth Expertise in Designing Surveys for Youth by Mary Murray

Greetings. I’m Mary Murray, founder of MEMconsultants. For two decades my team has built the capacity of youth-serving organizations to use evaluation practices to strengthen programs. Consistently, this has involved gathering information from youth via end-of-program surveys, interviews, and focus groups. Often, we incorporate activities that expand beyond simply collecting information from …


GAO Week: Using Evidence and Rigor to Answer Complex Questions and Assess Important Programs by Lawrance L. Evans, Jr.

Good people doing good work, Lawrance L. Evans, Jr. here—Managing Director of the U.S. Government Accountability Office’s Applied Research and Methods team. My team houses the agency’s technical and methodological specialists with expertise ranging from cost-benefit analysis and data analytics to survey methods and future-oriented analyses. As GAO receives important questions from Congress, our network …


NA TIG Week: Using a Needs Assessment Double Scale Survey Method to Pilot Test a Survey Instrument by Sue Hamann

I’m Sue Hamann, an evaluator of 40 years, employed at the National Institutes of Health as a Health Scientist and Science Evaluation Officer. Many people who are not trained in the analytic sciences and social research sciences are called upon to conduct evaluations. For the last several years, I’ve tried to bring issues about …


EPE TIG Week: Testing Ways to Inspire Conservation Action by Shuli Rank

Hi everyone, my name is Shuli Rank and I’m a Research and Evaluation Associate at the Wildlife Conservation Society (WCS). WCS runs the Bronx Zoo, Central Park Zoo, New York Aquarium, Prospect Park Zoo, and Queens Zoo, and conducts local and global conservation and research activities in 16 priority regions around the world. The focus …


Analyzing surveys with complex sampling designs? There’s an R package for that! by Monique Farone

Hi there! I’m Monique Farone (@moniquefarone), an internal evaluator at a local health department just outside Atlanta. At the 2018 American Public Health Association annual meeting, I presented a poster that examined data from our local Youth Risk Behavior Survey (YRBS) – data that I miraculously analyzed using R. I say that because I’m new …
