AEA365 | A Tip-a-Day by and for Evaluators

Hello! My name is Valerie Futch Ehrlich and I am the Evaluation and Research Lead for the Societal Advancement group at the Center for Creative Leadership. My team focuses on supporting our K-12, higher education, non-profit, and public health sector initiatives through evaluation and research. I want to share with you our recent experience using pulse surveys to collect feedback from school-wide faculty on a professional development initiative.

“Pulse surveys” are short, specific, and actionable surveys intended to collect rapid feedback that is immediately used to inform the direction of a program, activity, or culture. Through our partnership with Ravenscroft School, we used a pulse survey midway through a (mandated) year-long professional development experience and timed it so that the pulse feedback would inform the next phase of programming.

We used Waggl, a tool designed for pulse surveys, which has a simple interface offering yes/no questions, agreement scales, or a single open-ended question. A neat feature of Waggl is that it allows voting for as long as the pulse is open, encouraging participants to read their peers’ open-ended responses and vote on them. This way, the most actionable requests filter up to the top based on voting, which can help drive decisions.
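
Waggl handles that ranking for you, but the underlying idea is simple enough to sketch. Here is a minimal example in Python, with invented responses and vote counts (not Waggl’s actual export format):

```python
# Hypothetical pulse-survey export: open-ended responses with peer votes.
responses = [
    {"text": "More time to practice between sessions", "votes": 42},
    {"text": "Shorter sessions, offered more often", "votes": 17},
    {"text": "Examples tailored to lower-school faculty", "votes": 31},
]

# Sort so the most-voted (most actionable) requests rise to the top.
for r in sorted(responses, key=lambda r: r["votes"], reverse=True):
    print(f"{r['votes']:>3}  {r['text']}")
```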

In our case, the Waggl responses directly informed the design of the second phase of training. We also repeated the Waggl toward the end of the school year to quickly see if our program had its intended impact, to provide ideas for a more comprehensive evaluation survey, and to inform the next year of work with the school.

Hot Tips:

  • Keep your pulse survey short! This helps ensure participation. It should be no more than 5-10 questions and take only a minute or two to complete.
  • Pulse survey results are quick fodder for infographics! Waggl has this functionality built in, but with a little tweaking you could get similar information from a Google Form or other tools.
  • Consider demographic categories that might provide useful ways to cut the data (see the sketch after this list). We looked at differences across school levels and how different cohort groups were responding, which helped our program designers further tailor the training.
  • Pulse surveys build engagement and buy-in…when you use them! Faculty reported feeling very validated by our use of their feedback in the program design. The transparency and openness to feedback by our design team likely increased faculty buy-in for the entire program.
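
For the demographic-cut tip above, here is a minimal sketch of one way to slice pulse results, assuming a flat CSV export with invented column names (pulse tools differ in what they export):

```python
import pandas as pd

# Hypothetical export: one row per respondent; column names are invented.
# Expected columns: school_level, cohort, agree (1 = agree, 0 = disagree).
df = pd.read_csv("pulse_results.csv")

# Cut the agreement item by school level and cohort to see where the
# program is landing well and where it needs further tailoring.
cut = df.groupby(["school_level", "cohort"])["agree"].agg(["mean", "count"])
print(cut.round(2))
```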

Lesson Learned:

Think outside the box for pulse surveys. Although they are popular with companies for exploring employee engagement, imagine using them with parents at a school, mentors at an after-school program, or even students in a classroom giving feedback to their instructor. There are many possibilities! Any place you want quick, useful feedback would be a great place to add them. In our next phase of work, we are considering training school leaders to send out their own pulse surveys and incorporate the feedback into their practices. Stay tuned!

The American Evaluation Association is celebrating Ed Eval TIG Week with our colleagues in the PreK-12 Educational Evaluation Topical Interest Group. The contributions all this week to aea365 come from our Ed Eval TIG members. Do you have questions, concerns, kudos, or content to extend this aea365 contribution? Please add them in the comments section for this post on the aea365 webpage so that we may enrich our community of practice. Would you like to submit an aea365 Tip? Please send a note of interest to aea365@eval.org. aea365 is sponsored by the American Evaluation Association and provides a Tip-a-Day by and for evaluators.


Hello, fellow aea365 readers! My name is Leigh M. Tolley, and I am the Chair of the PreK-12 Educational Evaluation Topical Interest Group (TIG). Our TIG welcomes you to our series of posts for Teacher Appreciation Week!

As a former high school teacher and current Visiting Assistant Professor, Secondary Education at the University of Louisiana at Lafayette, I am always interested in learning more about educational evaluation and how it can benefit students, teachers, communities, school and university faculty and staff that work with pre- and in-service teachers, and the myriad other stakeholders and groups that are impacted by our work. To kick off this week, I would like to share some information about our TIG to help us all learn about and collaborate with each other.

Last year, our TIG distributed a survey to our members to try to learn more about us, our interests, and ways in which we would like to be more involved in the TIG and AEA. Although we had a small number of respondents in proportion to our entire TIG membership, this is what we know about ourselves so far:

Lesson Learned: Our TIG members are seasoned evaluators!

Of the 21 respondents to our survey, the majority have been practicing evaluators for over a decade.

Lesson Learned: Our members come from a range of organizations!

Here is a breakdown of the contexts in which the respondents worked:

[Chart: breakdown of the contexts in which respondents worked]

Lesson Learned: Benefits of TIG involvement!

The top reasons why respondents joined and stay involved with our TIG were networking, staying current on the latest evaluation methods and findings, sharing best practices, and advancing the field of evaluation.

Rad Resources:

We’d love to hear more from the many other members of our TIG, and AEA members in general! In what context do you practice, what are your interests, and how would you like to become more involved? Explore our social media links below, and contact our TIG’s Leadership Team at PreK12.Ed.Eval.TIG@gmail.com!

  • TIG Website: http://comm.eval.org/prk12/home
  • Facebook: We have migrated conversations from our old community page to our GROUP page: https://www.facebook.com/groups/907201272663363/. Please come “join” our group, as we use Facebook as a supplement to our website and as a place where we can communicate with each other, share ideas and resources, and just get to know friends, colleagues, and newcomers alike who have similar interests. Anyone who visits the page is welcome to post and share other links and resources with the group.
  • LinkedIn: Search for us on LinkedIn as PreK-12 Educational Evaluation TIG. This is a “members only” group, so please send a request to join in order to see the content.
  • Twitter: We are “tweeting” with the user name PreK-12 Ed. Eval. Follow @PK12EvalTIG at https://twitter.com/PK12EvalTIG.


The American Evaluation Association is celebrating Ed Eval TIG Week with our colleagues in the PreK-12 Educational Evaluation Topical Interest Group. The contributions all this week to aea365 come from our Ed Eval TIG members. Do you have questions, concerns, kudos, or content to extend this aea365 contribution? Please add them in the comments section for this post on the aea365 webpage so that we may enrich our community of practice. Would you like to submit an aea365 Tip? Please send a note of interest to aea365@eval.org. aea365 is sponsored by the American Evaluation Association and provides a Tip-a-Day by and for evaluators.


Good morning! I’m Liz Zadnik, aea365’s Outreach Coordinator and Saturday contributor. Part of my role on the curating team is working with evaluators and researchers interested in generating content for the blog. Writing for the web is a little different from drafting an evaluation report, policy brief, or peer-reviewed journal article – it requires a slightly more conversational and informal tone. I’ve pulled together a few tips and resources for folks interested in refining their online writing style.

Hot Tip: Frontload your information. Basically, put the most interesting or poignant nuggets first. This is a little different from most of the documents you usually write, where results or findings come after the context that frames them. Not online. Blog and website visitors are looking for something – give them what they want. They’ll peruse a page, scanning for keywords. If they don’t see what they’re looking for, they’ll leave.

Lesson Learned: White space is your friend.  Many people equate dense paragraphs with quality – that won’t do for online content!  Embrace patches of white space – throughout the page and also within the content.  “How do I do that?!”  Well, you can use bulleted or numbered lists, images, or line breaks between paragraphs.  Don’t worry if you feel it looks sparse – your readers will thank you!    

Hot Tip: Get active! With your voice, that is. Writing for the web is intended to keep the visitor engaged for a short period of time. Folks have something in mind when they visit a site and want to be spoken to directly. Active voice helps create that atmosphere – it also makes blocks of text more readable and scannable.

FROM “The participants’ questions were gathered by the meeting facilitator.” (passive)

TO “The meeting facilitator gathered participants’ questions.” (active)

Just to be clear, passive voice isn’t bad.  It has its place in scientific and academic writing.  But blogs and websites are different and should look and sound different.  This style can be difficult to practice at first, but I’ve found it has strengthened my writing both professionally and personally.   

Rad Resources:

  • Usability.gov offers a checklist and more tips on effectively writing for the web.
  • Writing Spaces pulled together a style guide a few years ago – it has some nice background on different platforms and “genres” of web writing.
  • Speaking of style guides, Sum of Us offers a very thoughtful one, A Progressive’s Style Guide, for folks interested in harnessing language as a tool for social change. 

I would also encourage you to pay attention to blogs and websites you really like.  How do they use white space?  How/Do they offer a scannable page for visitors?  What information do they offer?  

Do you have questions, concerns, kudos, or content to extend this aea365 contribution? Please add them in the comments section for this post on the aea365 webpage so that we may enrich our community of practice. Would you like to submit an aea365 Tip? Please send a note of interest to aea365@eval.org. aea365 is sponsored by the American Evaluation Association and provides a Tip-a-Day by and for evaluators.


I’m Marti Frank, a researcher and evaluator based in Portland, Oregon. Over the last three years I’ve worked in the energy efficiency and social justice worlds, and it’s given me the opportunity to see how much these fields have to teach one another.

For evaluators working with environmental programs – and energy efficiency in particular – here are two lessons I’ve learned that can help us do a better job of documenting these programs’ impacts.

Lessons Learned:

1) A program designed to address an environmental goal – for example, reducing energy use or cleaning up pollution – will almost always have other, more far-reaching impacts. As evaluators, we need to be open to these in order to capture the full range of a program’s benefits.

Example: A weatherization workshop run by Portland non-profit Community Energy Project (where I am on the Board) teaches people how to make simple, inexpensive changes to their homes to reduce drafts and air leaks. While the program’s goal is to reduce energy use, participants report many other benefits: more disposable income, reduced need for public assistance, feeling less worried about paying bills, and having more time to spend with family.

2) Not all people will be equally impacted by an environmental program, or even impacted in the same way. Further, there may be systematic differences in how, and how much, people are impacted.

Example #1: Energy efficiency programs assign a single value for energy savings, even though the same quantity of savings will mean very different things to different households, depending in large part on their energy burden (the percent of their income they spend on energy).
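
The arithmetic behind this point is worth making concrete. A quick sketch with invented households shows how the same savings figure lands very differently depending on energy burden:

```python
# Energy burden = annual energy spending / annual income.
households = {
    "lower-income":  {"income": 25_000,  "energy": 2_500},  # invented figures
    "higher-income": {"income": 150_000, "energy": 3_000},
}
savings = 200  # the same program-reported savings, in dollars per year

for name, h in households.items():
    before = h["energy"] / h["income"]
    after = (h["energy"] - savings) / h["income"]
    print(f"{name}: burden {before:.1%} -> {after:.1%}, "
          f"savings = {savings / h['energy']:.0%} of the energy bill")
```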

Example #2: A California energy efficiency program provided rebates on efficient household appliances, like refrigerators. Although the rebates were available to everyone, the households who redeemed them (and thus benefited from the program) were disproportionately wealthy and college-educated, relative to all Californians.

Rad Resources:

I’ve found three evaluation approaches to be helpful in identifying unintended impacts of environmental programs.

Outcome harvesting. This evaluation practice encourages us to look for all program outcomes, not just those that were intended. Ricardo Wilson-Grau, who developed it, hosts this site with materials to get you started.

Intersectionality. This conceptual approach originated in feminist theory and reminds us to think about how differing clusters of demographic characteristics influence how we experience the world and perceive benefits of social programs.

Open-ended qualitative interviews. It’s hard to imagine unearthing unexpected outcomes using closed-ended questions. I always enjoy what I learn from asking open-ended questions, giving people plenty of time to respond, and even staying quiet a little too long. And, I’ve yet to find an interviewee who doesn’t come up with another interesting point when asked, “Anything else?”

The American Evaluation Association is celebrating Environmental Program Evaluation TIG Week with our colleagues in the Environmental Program Evaluation Topical Interest Group. The contributions all this week to aea365 come from our EPE TIG members. Do you have questions, concerns, kudos, or content to extend this aea365 contribution? Please add them in the comments section for this post on the aea365 webpage so that we may enrich our community of practice. Would you like to submit an aea365 Tip? Please send a note of interest to aea365@eval.org. aea365 is sponsored by the American Evaluation Association and provides a Tip-a-Day by and for evaluators.

Kara Crohn and Matt Galport here – we’re consultants with EMI Consulting, an evaluation and consulting firm based in Seattle, Washington, that focuses on energy efficiency and renewable energy programs and policies. More than ever, evaluators must consider how their clients’ programs impact the well-being of the communities and environments in which they are embedded. It is also important for evaluators to consider how their clients’ program goals relate to state, national, or global sustainability goals. In this post, we offer five types of systems-oriented sustainability metrics that evaluators can use to connect clients’ program contributions to broader environmental, economic, health, and social measures of well-being.

But first, what do we mean by “sustainability”?

In this post, we’re not talking about the longevity of the program, but rather the extent to which a program’s outcomes, intended or otherwise, contribute to or detract from the future well-being of its stakeholders. We are also using an expanded definition of “stakeholders” that includes communities and environmental resources affected by the program.

Hot Tip:

Consider incorporating these five types of sustainability metrics into your next evaluation:

#1: Public health: The extent to which a program contributes to or detracts from the health of program and community stakeholders

#2: Environment and energy: The extent to which a program implements environmental and energy conservation policies that support resource conservation

#3: Community cohesion: The extent to which a program promotes or detracts from the vibrancy and trust of the communities in which it is embedded

#4: Equity: The extent to which a program contributes to or detracts from fair and just distribution of resources

#5: Policy and governance: The extent to which a program’s policies support civil society and democratic institutions and protect the disadvantaged

So, what would this look like in practice?

Here’s an example of how to connect program-specific metrics for a small, local after-school tutoring program to the broader set of social goals.
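
As a hedged sketch (the metrics below are invented for illustration, not drawn from an actual evaluation), a mapping from program-specific metrics to the five categories might look like this:

```python
# Hypothetical metrics for an after-school tutoring program, mapped to
# the five systems-oriented sustainability categories described above.
metric_map = {
    "Public health":        "share of students reporting reduced school-related stress",
    "Environment & energy": "share of students walking or biking to the tutoring site",
    "Community cohesion":   "number of volunteer tutors recruited from the neighborhood",
    "Equity":               "enrollment share from under-resourced schools",
    "Policy & governance":  "student and parent seats on the program advisory board",
}

for category, metric in metric_map.items():
    print(f"{category}: {metric}")
```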

Rad Resources:

Resources for municipal and global sustainability metrics:

Municipal: STAR Rating system for U.S. cities

Global: United Nations’ Sustainable Development Goals

Continue the conversation with us! Kara kcrohn@emiconsulting.com and Matt mgalport@emiconsulting.com.

The American Evaluation Association is celebrating Environmental Program Evaluation TIG Week with our colleagues in the Environmental Program Evaluation Topical Interest Group. The contributions all this week to aea365 come from our EPE TIG members. Do you have questions, concerns, kudos, or content to extend this aea365 contribution? Please add them in the comments section for this post on the aea365 webpage so that we may enrich our community of practice. Would you like to submit an aea365 Tip? Please send a note of interest to aea365@eval.org. aea365 is sponsored by the American Evaluation Association and provides a Tip-a-Day by and for evaluators.

My name is Allison Van and I am currently an evaluator at the Clinical and Translational Science Awards (CTSA) program at the University of North Carolina at Chapel Hill and the owner of Allison Van Consulting. Previously I managed The Pasture Project for Winrock International, an effort to build a movement among farmers in the Midwest to reintegrate livestock rotation for both greater profit and environmental benefit. The project benefitted from funders that were willing to take chances with us, allowing for a budget where resources could be shifted to account for new information or opportunities. Our strategies were highly diverse – demonstration sites on farms, supporting a collaboration of farmer educators, training dedicated farmers in public speaking – yet all were directed at influencing the decision-making of individual farmers. Some strategies were about direct influence while others focused on building capacity – in both cases, results wouldn’t be seen for years and were highly dependent on external circumstances.

As both the program manager and default evaluator, my goal was to test strategies relatively quickly, rigorously, and cheaply – then modify, end, or expand them within 6-18 months. I needed an approach the team could use to compare the development of different strategies so money could be funneled where it was most likely to make a difference. Understanding that the core challenge was one of budget allocation amid uncertainty and long time horizons was critical to selecting the right evaluation approach.

Rad Resources: The combination of Michael Quinn Patton’s Developmental Evaluation and E. Jane Davidson’s Real Evaluation were my constant guides to developing an evaluation approach and making decisions in the context of extreme uncertainty and long time horizons.

Hot Tip: There are profound trade-offs and opportunity costs in social change, making value for money a critical measure of program effectiveness. How programs invest their resources can be the most fundamental determinant of success. A bootstrap method combining cost-effectiveness analysis, the logic model for each strategy, and rubrics of early-stage indicators of behavior change allowed us to thoughtfully consider how to make and shift investments.
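
As a minimal sketch of the value-for-money comparison this supports – with invented strategies, costs, and rubric scores (the real method also drew on each strategy’s logic model):

```python
# Hypothetical strategies scored on a 0-4 rubric of early-stage
# behavior-change indicators, compared against their annual cost.
strategies = {
    "demonstration sites":     {"cost": 60_000, "rubric": 3.2},
    "educator collaboration":  {"cost": 25_000, "rubric": 2.1},
    "farmer speaker training": {"cost": 15_000, "rubric": 2.6},
}

# Cost per rubric point: a rough signal for where to shift investment.
ranked = sorted(strategies.items(), key=lambda kv: kv[1]["cost"] / kv[1]["rubric"])
for name, s in ranked:
    print(f"{name}: ${s['cost'] / s['rubric']:,.0f} per rubric point")
```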

The American Evaluation Association is celebrating Environmental Program Evaluation TIG Week with our colleagues in the Environmental Program Evaluation Topical Interest Group. The contributions all this week to aea365 come from our EPE TIG members. Do you have questions, concerns, kudos, or content to extend this aea365 contribution? Please add them in the comments section for this post on the aea365 webpage so that we may enrich our community of practice. Would you like to submit an aea365 Tip? Please send a note of interest to aea365@eval.org. aea365 is sponsored by the American Evaluation Association and provides a Tip-a-Day by and for evaluators.


Hello, I am Marcie Weinandt, and I have been working with Minnesota rural and agricultural communities my entire career, as an elected official, program manager, and policy developer. My state, “The Land of 10,000 Lakes,” has had to face a hard truth: water quality in Minnesota is being threatened by agricultural field runoff. I currently serve as operations coordinator of the Minnesota Agricultural Water Quality Certification Program (MAWQCP), designed to bridge the urban/rural divide and protect water quality by providing the regulatory certainty farmers need and the assurance the public demands.

MAWQCP has pioneered a new model of conservation delivery that works on a field-by-field, whole-farm basis to identify and mitigate agricultural risks to water quality. Once a farmer has mitigated their farm’s risks to water quality, the farmer is eligible to become certified and sign a 10-year contract with the State indicating that the certified farmer will be in compliance with any new state water laws or rules. Through the contract, farmers receive the regulatory certainty they need to make long-term decisions, and the general public is assured that farmers are managing their operations to protect water.

Central to the program’s success is the collaboration among Minnesota’s state agencies. The Minnesota Departments of Agriculture and Natural Resources, the Pollution Control Agency, and the Board of Water and Soil Resources all support the program, uphold the contract provision of regulatory certainty, and are implementing additional benefits for MAWQCP-certified farmers within their respective agencies.

Recognizing early on that the intergovernmental MAWQCP has several partners, funding streams, and constituencies, we realized it did not fit neatly into any single evaluation approach. Multiple evaluation methods were developed at inception to triangulate expected project outcomes. Formative knowledge, attitude, and practice (KAP) surveys were used during the pilot phase to inform program direction and to set a baseline. Later, summative KAP surveys yielded a second database against which behavioral changes could be measured in specific watersheds over time. In addition, advisory committee members were interviewed, and a post-certification farmer survey was conducted. MAWQCP gathers information on three other levels: environmental, participatory, and political.

Lessons Learned:

  • Farmers have a very high concern for water quality and especially for reducing soil erosion.
  • They are also concerned about public perception of agriculture.
  • The KAP study revealed that technical assistance from a trusted source was valued, and that financial assistance was appreciated but not necessary, for adopting and maintaining an agricultural conservation or management practice.

Rad Resources:

MAWQCP: Knowledge, Attitudes and Practices (KAP) Study Final Report, June 20, 2016

This KAP Study was conducted to better inform the implementation process of the MAWQCP.

The American Evaluation Association is celebrating Environmental Program Evaluation TIG Week with our colleagues in the Environmental Program Evaluation Topical Interest Group. The contributions all this week to aea365 come from our EPE TIG members. Do you have questions, concerns, kudos, or content to extend this aea365 contribution? Please add them in the comments section for this post on the aea365 webpage so that we may enrich our community of practice. Would you like to submit an aea365 Tip? Please send a note of interest to aea365@eval.org. aea365 is sponsored by the American Evaluation Association and provides a Tip-a-Day by and for evaluators.


My name is Karlyn Eckman and I research the human dimensions of natural resources programs in Minnesota and also in developing countries.  While working in Somalia in the 1980s I learned to use the KAP (knowledge, attitudes and practices) study method to evaluate project outcomes. Since 2006 our University of Minnesota team has conducted about forty KAP studies on a variety of environmental projects.  Most studies assess whether people have adopted and maintained recommended conservation practices, or acquired new knowledge about an environmental issue.

One such project is the Native Shoreland Buffer Initiatives (NSBI) project of the Minnesota Department of Natural Resources.  NSBI encouraged shoreland property owners on three northern Minnesota lakes to adopt conservation practices to improve water quality. Many lakes are impaired by excess nutrients such as nitrates and phosphorus from agricultural operations, or bacteria from septic systems.  The NSBI designed customized conservation messages for landowners based on specific impairments, and tested different ways of delivering those messages (radio programs, social clubs and gatherings, peer-to-peer messaging, visits by technical experts, brochures, etc.).

The KAP study provided us with an economical and focused way of planning an evaluation-ready project. We began with a “gap exercise” to review what we didn’t know about landowners and would need to know before designing NSBI outreach efforts. Based on the gaps we identified, we developed a questionnaire that was administered as a formative evaluation. Outreach messages were based upon survey results. Two years later we repeated the survey. Data from the two surveys were directly comparable. We had solid evidence that the NSBI was effective not only in changing people’s behaviors and encouraging the adoption of conservation practices, but also in improving their knowledge about water quality. We also learned which message delivery approaches were most effective: peer-to-peer messaging, social gatherings, and visits by technical experts.
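
Because the two administrations used the same items, change can be read item by item. A minimal sketch with invented item names and percent-favorable scores:

```python
# Hypothetical KAP items with baseline (formative) and follow-up scores,
# expressed as the share of landowners responding favorably.
items = {
    "knows excess nutrients impair lakes":   {"baseline": 0.41, "follow_up": 0.68},
    "maintains a native shoreland buffer":   {"baseline": 0.22, "follow_up": 0.35},
    "views runoff as a problem on own lake": {"baseline": 0.55, "follow_up": 0.71},
}

for item, s in items.items():
    change = s["follow_up"] - s["baseline"]
    print(f"{item}: {s['baseline']:.0%} -> {s['follow_up']:.0%} ({change:+.0%})")
```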

KAP studies are effective in many environmental projects. We have used the method to evaluate:

  • The outcomes of water quality projects for urban and suburban residents (for example, changes in residents’ knowledge of and attitudes about local water bodies)
  • Effectiveness of training (for example, whether snowplow drivers reduce the application of de-icing chemicals)
  • Adoption of best practices by farmers to reduce soil erosion and agrochemical use
  • Cost-effectiveness of water quality projects, as measured by reduced purchases of road salt by county public works departments

KAP studies can be used at any scale, with qualitative or quantitative data, and with multiple methods.  We have used KAP studies to evaluate the human dimensions of invasive species, use of agricultural practices, recreational behaviors, shifting cultivation in developing countries, application of road salt by snowplow drivers, and many other issues. In each case the KAP study method has provided critical data demonstrating project success, which is important to donors and state agencies wanting evidence of the value of investment of public resources in environmental efforts.

Rad Resources:

The final evaluation report for the NSBI project can be found here.

Photo: The author (left) and team triangulating survey data on Johnson Lake in Itasca County, Minnesota

The American Evaluation Association is celebrating Environmental Program Evaluation TIG Week with our colleagues in the Environmental Program Evaluation Topical Interest Group. The contributions all this week to aea365 come from our EPE TIG members. Do you have questions, concerns, kudos, or content to extend this aea365 contribution? Please add them in the comments section for this post on the aea365 webpage so that we may enrich our community of practice. Would you like to submit an aea365 Tip? Please send a note of interest to aea365@eval.org. aea365 is sponsored by the American Evaluation Association and provides a Tip-a-Day by and for evaluators.


Hi, I’m Rupu Gupta, Conservation Psychologist, Co-Chair of AEA’s Environmental Program Evaluation Topical Interest Group, and Researcher at NewKnowledge.org, a research and evaluation think tank.

For the past five years, I have evaluated the EECapacity project, supported by a cooperative agreement between Cornell University and the US Environmental Protection Agency. It aimed to expand the critical role environmental education (EE) plays in fostering healthy environments and communities. The project’s overall strategy was to link an emerging cadre of diverse EE professionals working in urban environmental stewardship, community, and environmental justice organizations with established environmental educators who are active in nationwide professional and government networks. A number of resources, online learning opportunities, and professional development activities were employed to engage groups of environmental educators across the country.

A number of key insights relevant to evaluating environmental programs emerged from the findings – and, as Earth Week approaches, they are all the more relevant for thinking about the intersections between environmental and human outcomes.

Lessons Learned:

  • Environmental educators in the US are more racially and ethnically diverse than professional association rosters suggest – when evaluating environmental projects focused on diversity, we have to be mindful of the characteristics of the population of interest to understand change.
  • Environmental educators hold multiple perspectives about the goals of EE – the perception of differences in EE’s purpose (connecting kids with the outdoors versus fostering youth leadership in community gardening) often overlooks shared outcomes for young audiences.
  • Culturally responsive approaches are critical to initiating relationship building between educators affiliated with professional EE and those aligned with community-focused goals – the processes of interaction are as important as the outcomes of potential partnerships.
  • Social identities tied to groups defining unique approaches to environment-focused work are important to consider – if the goal is to foster collaboration, a proxy may be the development of a superordinate identity that recognizes the shared goals of groups that have not previously connected.
  • Differences in power and status are inherent in environmental projects whose stakeholders range from national-level EE organizations to community-focused programs – project outcomes will be meaningfully interpreted only when the evaluation context and the interactions between key stakeholders are honestly examined.

Earth Week is a reminder of the global imperative to protect our planet’s flora and fauna in ways that complement the social goals of altruism, equity, and justice. For those evaluating environmental programs, it is a humbling moment to reflect on the motivated, human-led actions and approaches that create environmental and, inevitably, societal change.

Rad Resources:

For more information:

The American Evaluation Association is celebrating Environmental Program Evaluation TIG Week with our colleagues in the Environmental Program Evaluation Topical Interest Group. The contributions all this week to aea365 come from our EPE TIG members. Do you have questions, concerns, kudos, or content to extend this aea365 contribution? Please add them in the comments section for this post on the aea365 webpage so that we may enrich our community of practice. Would you like to submit an aea365 Tip? Please send a note of interest to aea365@eval.org. aea365 is sponsored by the American Evaluation Association and provides a Tip-a-Day by and for evaluators.


Hello all! I’m Sheila B Robinson, aea365’s Lead Curator and sometimes Saturday contributor with a Hot Tip and Rad Resource for presentation designers!

On March 30, we unveiled a new, reorganized and freshened up Potent Presentations Initiative (p2i) website. Here’s what you’ll find:

  1. On the p2i HOME page, you’ll find a brief introduction to p2i, and our 3 key components – Message, Design, and Delivery. Webinars for each provide in-depth learning and reference some of the resources found on the PRESENTATION TOOLS & GUIDELINES page.
  2. All downloadable resources live on the PRESENTATION TOOLS & GUIDELINES page. The page is organized with Checklists & Worksheets on top, then resources aligned to the p2i components – Message, Design, and Delivery – followed by resources for Audience Engagement. As you browse this page, you’ll find links to additional content and pages along with the tools. Just look for tool titles that are links; for example, “Slide Design Guidelines” is a link that takes you to another page of content on Slide Design. Another key addition is that the authors who contributed the content are now recognized, with their names linked to their websites or LinkedIn profiles.
  3. Given that posters are the largest category of presentations at our annual conference, POSTER PRESENTATIONS warranted its own page. Here, you’ll find specific guidelines for designing a conference poster, along with two additional navigation buttons. One takes you to more content on Research Poster Design, while the other points to Award-Winning Posters from recent AEA conferences and other organizations. Each poster image is accompanied by a brief explanation of what makes it a winner.
  4. Don’t forget to visit the ABOUT US page to learn about the folks who have contributed to making p2i what it is!
  5. We now have a hashtag that is all ours: #aeap2i. Please tweet about the p2i website and resources using this tag. Follow the hashtag #aeap2i by clicking on the top button found on the p2i HOME page, and while you’re at it, why not follow the association itself (@aeaweb) as well! 

Behind the scenes…

Over the last year, we’ve worked to migrate and reorganize all content from the original p2i website to the main AEA site at eval.org (kudos to Zachary Grays, who did the heavy lifting!). We updated the tools and added new content and introductory language where needed. One reason for the move was to protect us from hackers. Our original site, built on a different platform, was a constant target, and over the years we received countless notices from members that the site URL had been maliciously redirected (meaning it took people to a different website) or that downloads were not working. We’re now confident that the new site and all of our great content will be safe and reliable.

Be sure to visit eval.org/p2i and let us know what you think!

Sneak Preview! We have exciting new content for our p2i resource collection on its way to publication. Stay tuned to learn more!

Do you have questions, concerns, kudos, or content to extend this aea365 contribution? Please add them in the comments section for this post on the aea365 webpage so that we may enrich our community of practice. Would you like to submit an aea365 Tip? Please send a note of interest to aea365@eval.org. aea365 is sponsored by the American Evaluation Association and provides a Tip-a-Day by and for evaluators.
