AEA365 | A Tip-a-Day by and for Evaluators


Greetings and welcome from the Disabilities and Underrepresented Populations TIG week.  We are June Gothberg, Chair, and Caitlyn Bukaty, Program Chair.  This week we have a strong lineup of great resources, tips, and lessons learned for engaging typically underrepresented populations in evaluation efforts.

You might have noticed that we changed our name from Disabilities and Other Vulnerable Populations to Disabilities and Underrepresented Populations, and you may be wondering why.  It came to our attention during 2016 that several of our members felt our previous name was inappropriate and had the potential to be offensive.  Historically, a little under 50% of our TIG’s presentations have represented people with disabilities; the rest cover a diverse group ranging from migrants to teen parents.  The following Wordle shows the categories represented in our TIG’s presentations.

Categories represented by the Disabilities and Underrepresented Populations presentations from 1989-2016

TIG members felt that the use of vulnerable in our name attached a negative and, in some cases, offensive label to the populations we represent.  Thus, after discussion, communication, and consensus building, we proposed to the AEA board that our name be changed to Disabilities and Underrepresented Populations.

Lessons Learned:

  • Words are important! Labels are even more important!
  • Words can hurt or empower; it’s up to you.
  • Language affects attitudes and attitudes affect actions.

Hot Tips:

  • If we are to be effective evaluators, we need to pay attention to the words we use in written and verbal communication.
  • Always put people first, labels last. For example: student with a disability, man with autism, woman with dyslexia.

The nearly yearlong name change process reminded us of the lengthy campaign to rid federal policy and documents of the R-word.  If you happened to miss the Spread the Word to End the Word campaign, there are several great videos and other resources at r-word.org.

YouTube video – Spread the Word to End the Word (high school video): https://www.youtube.com/watch?v=kTGo_dp_S-k&feature=youtu.be

Rosa’s Law (Bill S. 2781), which takes its name and inspiration from 9-year-old Rosa Marcellino, removes the terms “mental retardation” and “mentally retarded” from federal health, education, and labor policy and replaces them with the people-first language “individual with an intellectual disability” and “intellectual disability.” The signing of Rosa’s Law was a significant milestone in establishing dignity, inclusion, and respect for all people with intellectual disabilities.

So, what’s in a name?  Maybe more than you think!


· · · · · · ·

I’m Brian Yoder. I live in Washington, D.C., and I serve as president of the local AEA affiliate, the Washington Evaluators.  I’m writing about an initiative I spearheaded called Evaluators Visit Capitol Hill (EVCH), which took place during the AEA conference in Washington, D.C. last year.

Clipped from http://washingtonevaluators.roundtablelive.org/EVCH

EVCH is a collaboration between AEA’s Evaluation Policy Task Force (EPTF) and Washington Evaluators (WE).  EPTF provided policy documents related to the role of evaluation in government; WE organized AEA members to visit their congressperson’s office.  During the conference, participants dropped off the documents, spoke about evaluation, and asked whether anyone in the office would be interested in being contacted by an EPTF member to discuss evaluation further or receive additional information.

A total of 69 participants from 31 states and the District of Columbia signed up and participated in an initial training conference call.  The federal government shut down for two weeks prior to the AEA conference, creating challenges for some participants in scheduling appointments with their congressperson’s office.  Eighteen participants, visiting twenty-one different congressional offices, completed a post-meeting survey. One third of the congressional offices visited said they were interested in receiving additional materials from EPTF.  AEA members reported having opportunities to speak with congressional staff about issues related to the evaluation of government programs.  An unanticipated outcome of the shutdown was that some AEA members were able to meet with their representative or senator directly, since those members of Congress were available because of the closure.

My hope is that this initiative helped accomplish three things:

  1. Make more policymakers aware of AEA and the work of EPTF.
  2. Expand EPTF’s reach by creating new connections for the task force.
  3. Give evaluators the opportunity to be part of the early policy-making process by providing materials on evaluation to policymakers before policies are made.

We plan to continue and expand the initiative the next time AEA’s annual meeting is in Washington, D.C.  Please be on the lookout for additional information on how you can participate in EVCH in the run-up to the next conference there.

Rad Resource: The Washington Evaluators, one of the oldest local AEA affiliates. If you are ever in D.C., please join us for one of our storied brown bag sessions or other events.  You’ll find information on the website: washingtonevaluators.org/events

The American Evaluation Association is celebrating Washington Evaluators (WE) Affiliate Week. The contributions all this week to aea365 come from WE Affiliate members. Do you have questions, concerns, kudos, or content to extend this aea365 contribution? Please add them in the comments section for this post on the aea365 webpage so that we may enrich our community of practice. Would you like to submit an aea365 Tip? Please send a note of interest to aea365@eval.org. aea365 is sponsored by the American Evaluation Association and provides a Tip-a-Day by and for evaluators.


Hello, I am Maxine Gilling, Research Associate for Gaining Early Awareness and Readiness for Undergraduate Programs (GEAR UP). I recently completed my dissertation entitled How Politics, Economics, and Technology Influence Evaluation Requirements for Federally Funded Projects: A Historical Study of the Elementary and Secondary Education Act from 1965 to 2005. In this study, I examined the interaction of national political, economic, and technological factors as they influenced the concurrent evolution of federally mandated evaluation requirements.

Lessons Learned:

  • Program evaluation does not take place in a vacuum. The field and profession of program evaluation have grown and expanded over the last four decades and eight administrations due to political, economic, and technological factors.
  • Legislation drives evaluation policy. The Elementary and Secondary Education Act (ESEA) of 1965 established policies to provide “financial assistance to local educational agencies serving areas with concentrations of children from low-income families to expand and improve their educational program” (Public Law 89-10—Apr. 11, 1965). This legislation also had another consequence: it helped drive the establishment of educational program evaluation and the field of evaluation as a profession.
  • Economics influences evaluation policy and practice. For instance, in the 1980s, evaluation took a downturn due to stringent economic policies, and program evaluators turned to sharing lessons learned through journals and books.
  • Technology influences evaluation policy and practice. The rapid emergence of new technologies contributed to changing the goals, standards, methods, and values underlying program evaluation.

Do you have questions, concerns, kudos, or content to extend this aea365 contribution? Please add them in the comments section for this post on the aea365 webpage so that we may enrich our community of practice. Would you like to submit an aea365 Tip? Please send a note of interest to aea365@eval.org. aea365 is sponsored by the American Evaluation Association and provides a Tip-a-Day by and for evaluators.

· · · · · · · · ·

Greetings from Boise, the city of trees! We are Rakesh Mohan (director) and Margaret Campbell (administrative coordinator) of Idaho’s legislative Office of Performance Evaluations (OPE). Margaret reviews drafts of our reports from a nonevaluator’s perspective and copyedits and desktop-publishes each report. In this post, we share our thoughts on the importance of writing evaluation reports with users in mind. Our users include legislators, the governor, agency officials, program managers, the public, and the press.

Lessons Learned: Writing effective reports for busy policymakers involves several criteria, such as logic, organization, and message. But in our experience, if your writing lacks clarity, the report will not be used. Clear writing takes time and can be difficult to accomplish. We have examined some reasons why reports may not be written clearly and declare these reasons to be myths:

Myth 1: I have to dumb down the report to write simply. Policymakers are generally sharp individuals with a multitude of issues on their minds and competing demands on their time. If we want their attention, we cannot rely on an academic writing style. Instead, we write clear and concise reports so that policymakers can glean the main message in a few minutes.

Myth 2: Complex or technical issues can’t be easily explained. When evaluators thoroughly understand the issue and write in active sentences from a broad perspective, they can explain complex and technical issues clearly.

Myth 3: Some edits are only cosmetic changes. Evaluators who seek excellence will welcome feedback on their draft reports. Seemingly minor changes can improve the rhythm of the text, which increases readability and clarity.

Our goal is to write concise, easy-to-understand reports so that end users can make good use of our evaluation work. We put our reports through a collaborative edit process (see our flowchart) to ensure we meet this goal. Two recent reports are products of our efforts:

Equity in Higher Education Funding

Reducing Barriers to Postsecondary Education

Hot Tips

  1. Have a nonevaluator review your draft report.
  2. Use a brief executive summary highlighting the report’s main message.
  3. Use simple active verbs.
  4. Avoid long strings of prepositional phrases.
  5. Pay attention to the rhythm of sentences.
  6. Vary your sentence length, avoiding long sentences.
  7. Write your key points first and follow with need-to-know details.
  8. Put technical details and other nonessential supporting information in appendices.
  9. Minimize jargon and acronyms.
  10. Use numbered and bulleted lists.
  11. Use headings and subheadings to guide the reader.
  12. Use sidebars to highlight key points.

Rad Resources

  • Revising Prose by Richard A. Lanham
  • Copyediting.com
  • Lapsing Into a Comma by Bill Walsh

We’re celebrating Data Visualization and Reporting Week with our colleagues in the DVR AEA Topical Interest Group. The contributions all this week to aea365 come from our DVR members and you may wish to consider subscribing to our weekly headlines and resources list where we’ll be highlighting DVR resources. Do you have questions, concerns, kudos, or content to extend this aea365 contribution? Please add them in the comments section for this post on the aea365 webpage so that we may enrich our community of practice.

·

Hi. I’m Anna Williams, Senior Associate at Ross & Associates Environmental Consulting, in Seattle, Washington.

Advocates, their funders, and policy advocacy evaluators seek to understand the results of policy advocacy work. Advocates promote the adoption (or reversal) of government policies, and many use the term “wins” to refer to successful milestones in their advocacy work. However, this term is often undefined and lacks context. “Wins” may mean many things: endorsements from public figures, favorable policy proposals, government bodies voting favorably, passage of desired policies into law, etc. Contribution/attribution aside, upon examination, the term “win” may or may not have a meaningful relationship to actual policy change.

Lessons Learned – The stark reality: Policy change is typically not linear, and it’s a long-term endeavor. The work can be downright messy. Progress one year can be weakened or reversed the next. Some policies are very weak by the time they are passed; others may have unforeseen consequences or fatal flaws. Later, implementation may be anything but guaranteed. Context matters. Policy work varies from place to place, country to country, venue to venue: one size does not fit all. There are windows of opportunity during which significant and durable policy change can occur quickly, but these are the exceptions.

When parties claim policy “wins,” we could ask for more precision. One philanthropy I work with has moved from “win” to “policy adoption” and defines the latter as follows: “Decision-makers have adopted, approved, or otherwise agreed to the policy or action; implementation is not yet underway.” (This philanthropy also defines stages preceding and following “policy adoption,” while acknowledging the limitations of this linear framework.)

A clearer view on policy (and advocacy) progress and “wins” can have a sobering effect, especially when we acknowledge the slow pace and volatility of policy change. But we need to help funders be realistic about the long-term nature of policy advocacy work, and avoid illusions concerning return on investment. The advocacy community need not be apologetic about these realities; however, it may take time to close the delta between funder expectations and on-the-ground realities. We all need to tell funders what they need to hear – not what they want to hear.

Lessons Learned: As policy advocacy evaluators, we should encourage advocates, policy advocacy funders, and the evaluation community to be clear about “wins” and to unapologetically convey that, even under ideal advocacy conditions, policy change takes time and even then can be vulnerable.

How do others view this issue? How do others define and track policy progress? What have others experienced when having these kinds of discussions with advocates and their funders?

We’re celebrating Advocacy and Policy Change week with our colleagues in the APC Topical Interest Group. Do you have questions, concerns, kudos, or content to extend this aea365 contribution? Please add them in the comments section for this post on the aea365 webpage so that we may enrich our community of practice. aea365 is sponsored by the American Evaluation Association and provides a Tip-a-Day by and for evaluators.

·

Hello everyone. My name is Ioana Munteanu, and I am a Social Science Analyst with the Smithsonian Institution’s Office of Policy and Analysis. The Smithsonian consists of 19 museums and the National Zoo; 9 research centers; and many other centers, programs, and projects. Central to the Office of Policy and Analysis’s mission is assisting upper management at all levels in making sound decisions on improving the Institution’s exhibitions and programs for physical and virtual visitors and for stakeholders. Dr. Carole Neves directs our office, which is composed of 12 skilled staff with diverse backgrounds, assisted by fellows and interns from both the United States and other countries.  Upon request, we conduct formative, process, and summative evaluations of both formal and informal programs and exhibitions offered on-site, off-site, and online; the studies may be Institution-wide or focused on a particular Smithsonian unit, or a department or program within a unit. The wonderful news is that over 100 of our studies are available online for free.  The link to our website is below.

Rad Resource: Studies of visitors to the Smithsonian provide a glimpse into who comes for general museum visits and publicly available offerings, and why; how satisfied they are with their visit and what experiences they had; and what factors contributed to their satisfaction and experiences. These studies include formative assessments conducted during preparatory phases, as well as studies of the output and impact of offerings. Staff employ a wide range of methodologies including, but not limited to, quantitative surveys (in person and online), qualitative interviewing, focus groups, observations, visitor tracking, and other methods such as card sorting and concept mapping.

Do you have questions, concerns, kudos, or content to extend this aea365 contribution? Please add them in the comments section for this post on the aea365 webpage so that we may enrich our community of practice. Would you like to submit an aea365 Tip? Please send a note of interest to aea365@eval.org. aea365 is sponsored by the American Evaluation Association and provides a Tip-a-Day by and for evaluators.

· · · ·

Namaste! I am Rakesh Mohan, Director of Idaho State Legislature’s Office of Performance Evaluations. I currently serve on the AEA Board and am a former executive committee member of the National Legislative Program Evaluation Society (NLPES). I hope this post will inform you of some good evaluation resources at the state government level.

This past July, in Louisville (KY), NLPES recognized excellence in evaluation at the annual summit of the National Conference of State Legislatures (NCSL). My office, along with Utah’s Office of the Legislative Auditor, shared the Excellence in Research Methods Award. The Excellence in Evaluation Award went to Florida’s Office of Program Policy Analysis and Government Accountability.

Resources: Legislatures in many US states depend on their evaluation shops to provide independent, nonpartisan information on a broad range of everyday issues such as education, health, social services, corrections, and transportation. If you are interested in some good evaluation and policy analysis work, I encourage you to check the NLPES link, which will connect you to every member state office doing evaluation work.

To my knowledge, there are at least two other legislative offices that conduct evaluation and policy work but are not listed on the NLPES link: Washington State Institute for Public Policy and the California Legislative Analyst’s Office.

Hot Tip: The work of these legislative offices ranges from limited-focus performance audits to large and complex program evaluations and policy analyses. The nature and scope of the work varies from one office to another depending on the authorizing environment and sociopolitical context in which the evaluation office operates. For example, in recent years, my office has focused on larger policy issues in public education, corrections, and transportation. The success of these evaluation offices depends largely on how well they adapt to their dynamic environments and respond to the information needs of their sponsors and stakeholders.

The American Evaluation Association is celebrating Government Evaluation Week with our colleagues in the Government Evaluation AEA Topical Interest Group. The contributions all this week to aea365 come from our GOVT TIG members and you may wish to consider subscribing to our weekly headlines and resources list where we’ll be highlighting Government-focused evaluation resources. You can also learn more from the GOVT TIG via their many sessions at Evaluation 2010 this November in San Antonio.

· · ·
