AEA365 | A Tip-a-Day by and for Evaluators

My name is Elissa Schloesser of Visual Voice. I am a freelance graphic designer specializing in infographics, data visualization, and reporting. I enjoy making complex information more understandable and engaging, and I have partnered with several evaluators to help visually communicate evaluation methods and findings.

Below are a few techniques I use to make logic models, theories of change, and other process diagrams more visually appealing and digestible.

Hot Tip: Start by considering your diagram’s purpose and audience. Edit content accordingly.

Is your diagram intended to be analyzed up close by your reader, or is it intended to provide a visual overview of your model or process? If it is meant to serve as a summary, include only the most important and relevant information.

Hot Tip: Establish a hierarchy of information and apply a consistent design style to each level.

Not all information should get equal visual weight and real estate. The main concepts and connections should be the biggest and boldest, while supporting details should be smaller and lighter. Establish a design style for each level of information and apply it consistently throughout. This is especially important when you are working with a diagram that has many layers of information.

Hot Tip: Use color to enhance your diagram, not make the diagram.

I like to think of color as a bonus feature in any diagram. Selective color use can help emphasize connections in your diagram (but try not to get carried away; using every color in the rainbow tends to be less effective). I also like to test whether my diagram is still understandable in grayscale, even if it will most likely be viewed in color. If the diagram is not understandable without color, go back and adjust the line weights, headings, or iconography.

Rad Resource: Below is a sample process diagram to help illustrate the points above. It is similar in structure to what you might use for a logic model. This diagram could be recreated in Word or PowerPoint in a table with invisible borders.

Rad Resource: For more advice on how to make your diagrams more digestible, check out this blog post by Frank Denneman – “10 guidelines for creating good looking diagrams”.

The American Evaluation Association is celebrating Data Visualization and Reporting (DVR) Week with our colleagues in the DVR Topical Interest Group. The contributions all this week to aea365 come from DVR TIG members. Do you have questions, concerns, kudos, or content to extend this aea365 contribution? Please add them in the comments section for this post on the aea365 webpage so that we may enrich our community of practice. Would you like to submit an aea365 Tip? Please send a note of interest to aea365@eval.org. aea365 is sponsored by the American Evaluation Association and provides a Tip-a-Day by and for evaluators.


Hello good people! My name is Robert Perez and I am a research assistant at Hamai Consulting and data analyst at Youth Policy Institute. I am responsible for cleaning and analyzing data from various sources and designing engaging ways to communicate findings using reports, dashboards, and infographics.

I spend some time at the beginning of the year planning out projects for my department using a number of different tools. One tool I love to use is the Gantt chart. I searched for a way to automate the conditional formatting process of coloring each of the representative time units and stumbled upon a template from Chandoo.org.


This particular Gantt chart allows the user to enter the starting week, project duration, and completion status, which automatically populates the timeline cells with colors depending on the department or project name. Formatting this chart took some “Excelbow” grease, as the original chart did not have the option to colorize the tasks based on the department or project.
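
If you want to build a similar chart from scratch rather than adapt the template, the coloring is typically driven by conditional formatting formulas that test whether each timeline cell falls within a task’s date range. Here is a minimal sketch under a hypothetical layout (not necessarily Chandoo’s): week numbers run across row 4 starting in column E, and each task’s department is in column B, its start week in column C, and its duration in column D.

  • Select the timeline cells (E5 and everything to the right and below), add a conditional formatting rule of the type “Use a formula to determine which cells to format,” and enter:
    • =AND(E$4>=$C5, E$4<$C5+$D5, $B5="Evaluation")
  • Choose the fill color for that department, then add one rule (and color) per department, e.g. $B5="Programs".

The mixed references do the work: E$4 always looks up at the week-number row, while $C5 and $D5 always look left at that row’s start week and duration, so a single rule covers the whole grid.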

Updating tools like this Gantt chart regularly will encourage their use and keep them from being lost in virtual oblivion. Keep in mind the technical skill of your audience and communicate with them during the design process to ensure that what you are creating will be of use to them.

Hot Tip: If you are having trouble understanding what a formula does, I find it helpful to break the formula into its component parts and paste each component into its own cell. That way, I can see the return value of each nested piece, which gives me more context on how the components work together.
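
For example (a hypothetical formula, not one from the Gantt template), suppose a cell contains a nested lookup:

  • =IFERROR(VLOOKUP(A2,Projects!A:C,3,FALSE),"Not found")

You could paste the inner piece, =VLOOKUP(A2,Projects!A:C,3,FALSE), into its own cell (say D2) and rewrite the outer piece as =IFERROR(D2,"Not found"). Seeing what the lookup returns on its own makes it much clearer what the error handling is actually catching.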

Hot Tip: No matter how often you design a visualization or a tool, the question of “who is your audience?” will always come up. Harvard Business Review offers some definitions of audience categories that might help with your design process:

Novice: first exposure to the subject; doesn’t want oversimplification

Generalist: aware of the topic, but looking for an overview understanding and major themes

Managerial: in-depth, actionable understanding of intricacies and interrelationships with access to detail

Expert: more exploration and discovery and less storytelling with great detail

Executive: only has time to glean the significance and conclusions of weighted probabilities

Rad Resource: Sometimes I need a bit of inspiration when designing my visuals. Fortunately, there is a veritable bevy of resources online from which we can fuel our creative engines. One of my favorite sites to visit is Contextures.com. The site caters to Excel novices, with lessons on everything from conditional formatting to VLOOKUPs, as well as to more experienced learners, with lessons on automation using VBA. Of course, there is also Chandoo.org, a site that has one of the friendliest community forums ever.

The American Evaluation Association is celebrating Data Visualization and Reporting (DVR) Week with our colleagues in the DVR Topical Interest Group. The contributions all this week to aea365 come from DVR TIG members. Do you have questions, concerns, kudos, or content to extend this aea365 contribution? Please add them in the comments section for this post on the aea365 webpage so that we may enrich our community of practice. Would you like to submit an aea365 Tip? Please send a note of interest to aea365@eval.org. aea365 is sponsored by the American Evaluation Association and provides a Tip-a-Day by and for evaluators.

 

 


Hello! I’m Julie Lamping, a research analyst at Harper College in Illinois. A lot of what I do is extracting data that is needed, formatting it into tables or charts, and providing a basic analysis to be used during decision making at the College. No matter the project, everything gets thrown into Excel at some point.

I absolutely love pivot tables and charts! I use them for simple validation, creating crosstabs and charts, and building dashboards. Excel has its limitations (*cough*itdieswhenthereistoomuchdata*cough*), so pivots aren’t miracles for everyone (just maybe for some of you).

Start with [fairly] clean data in Excel. Go to the INSERT tab and select PivotTable. Boom, now you have a pivot table (seriously, that’s it). Like everything Excel spits out at first, it is ugly. The PivotTable Tools tab will help you design the table to look however you want. You can even group items together. LifeProTip: have your color palette ready by creating a custom theme in Microsoft Office.

We’re data visualization people, so go back to the sheet with your data and insert a PivotChart the same way you would a table (hint: it’s usually located by the other charts under the INSERT tab). Format that bad boy with all the data visualization standards and skills you have. It looks so good (probably).

But say you want to give someone else the option to filter the chart how they want – we can do that! While the PivotChart is selected, head on over to the ANALYZE tab and hit “Insert Slicer”. Select the fields from your PivotTable that you want and click OK. LifeProTip: I use this function when creating dashboards in Excel.

Any selection on the slicer will filter your PivotTable and PivotChart. Hint: these slicers can also be customized to be appealing – I usually make mine gray, but Harper’s blue when selected.

Extra fun dashboarding – you can move your PivotTables and PivotCharts using the Move Chart function. If I’m creating a dashboard in Excel, I move my final charts and slicers over to a new sheet (usually renamed DASH for my sanity) and hide the rest of my mess.

TL;DR: Select the first cell in your Excel table and insert a PivotTable or PivotChart. Format and modify as needed. Insert slicers if you’re always extra like I am, then move everything to a separate sheet for a quick dashboard.

Happy Excel’ing!

Rad Resources:

For newbies to Pivots in Excel: Ann Emery’s Introduction to Pivot Tables

Jon covers finalizing the dashboard: Youtube Video on Excel Dashboarding by Jon from Excel Campus

All things dashboard’ing in Excel: SmartSheet’s Post on Excel Dashboards

Shameless plug: My [sassy] tutorial

The American Evaluation Association is celebrating Data Visualization and Reporting (DVR) Week with our colleagues in the DVR Topical Interest Group. The contributions all this week to aea365 come from DVR TIG members. Do you have questions, concerns, kudos, or content to extend this aea365 contribution? Please add them in the comments section for this post on the aea365 webpage so that we may enrich our community of practice. Would you like to submit an aea365 Tip? Please send a note of interest to aea365@eval.org. aea365 is sponsored by the American Evaluation Association and provides a Tip-a-Day by and for evaluators.

 


Greetings from Minneapolis, Minnesota! I am Angie Ficek, a Program Evaluator at Professional Data Analysts (PDA), an independent consulting firm specializing in evaluating public health programs. We wanted to share three data viz related features in Office 2016 that are easy and helpful for spiffing up evaluation reports and presentations.

Cool Tricks:

1. Icons: Icons are a fantastic way to communicate ideas without words, and there are some wonderful websites out there from which to download icons. But, did you know that Office 2016 has its own icons? Simply go to the Insert tab, and you will find the Icons option between Shapes and SmartArt.

The downside is that there are far fewer options than, say, Noun Project, but the great part is that you can easily change the color of the icon (to any color!). Just click on the icon, go to the Format tab, and select a color from the Graphics Fill menu. Hats off to you, Microsoft!

2. Maps: You may have noticed that Office 2016 has some new chart types, one of which is a Map chart. My colleague showed me just how easy it is to create a map with this feature. She had a column of county names and a column of numbers, and simply by clicking Insert > Map, she produced a map of that state with the data mapped and scaled accordingly. See an example of a default map below with some sample data.

You can plot countries/regions, states/provinces, counties, or postal codes on these maps, but not cities or street addresses. The color scale can be a sequential 2-color scale (as shown) or a diverging 3-color scale. To create the map, Microsoft sends your data to Bing Maps, which may or may not be an issue if you are mapping sensitive data.

3. Merge shapes: I recently stumbled across the Merge Shapes command in PowerPoint, which might not be a new feature, but it was new to me. I learned that it can do some fun things when merging a picture with words for added impact. Check it out:

To access the Merge Shapes command, select at least two shapes, images, or text boxes that you want to merge. From the Drawing Tools Format tab, select the Merge Shapes drop-down menu and choose one of the five commands shown below.

For a description of each Merge Shape command, see this Indezine article.

We are excited to keep exploring Office 2016 to see what other new or not-yet-discovered features it has!

The American Evaluation Association is celebrating Data Visualization and Reporting (DVR) Week with our colleagues in the DVR Topical Interest Group. The contributions all this week to aea365 come from DVR TIG members. Do you have questions, concerns, kudos, or content to extend this aea365 contribution? Please add them in the comments section for this post on the aea365 webpage so that we may enrich our community of practice. Would you like to submit an aea365 Tip? Please send a note of interest to aea365@eval.org. aea365 is sponsored by the American Evaluation Association and provides a Tip-a-Day by and for evaluators.


Hi! I’m Kate Diaz, Senior Manager for Corporate Measurement at TechnoServe. When working with data visualizations, it can sometimes be difficult to craft the message that drives home the key insight for your audience. That was an issue we faced as we began to plan our 2016 Impact Report, which uses data visualizations to help readers understand key aspects of TechnoServe’s impact. Thanks in no small part to expert guidance from AEA’s Ann Emery, however, we finished with a report that makes a strong, clear argument for our impact.

Lesson Learned: While creating the report’s data visualizations, I learned how much the revision process helps refine and isolate the message. When drafting a visualization, I often start out with three or four insights from the data that I want to convey to readers. Through revisions, by sharing the visualization with others and by putting it aside and coming back to it, I home in on the exact message that drives the report’s narrative. The revision process helps identify the message just as much as it helps identify the right visual.

Revisions included pen-and-paper sketches, the Paper 53 app, Excel, and photos taken during Skype calls. (Thanks again, Ann!)

For example, a key visualization in the 2016 Impact Report illustrates how we assess Financial Benefits, a measure of increased revenue and wages. Early iterations, shown above left, explored the idea of impact flows over time. But we were really interested in talking about 2016 impact, not a trendline. We tried a waterfall chart, which showed the composition of the whole but still accentuated a chronology. Doing away with a timeline, we tried stacked bar charts. These ultimately helped us identify the key insight for readers: that this year’s impact is a result of work we’re doing this year and the sustained impact from prior years. A few iterations later, our message was clear:

The finished product.

After so much effort, it can be hard to let go of the other hard-won insights about our impact in order to isolate the one that best fits the narrative. I’m often tempted to try to convey three or four different insights in one graph. But the result is a busy, messy visualization that is unsuccessful at driving any message home. TechnoServe’s annual impact report is a printed document, so space is limited, but online or presentation versions are more flexible. They are an opportunity to explore the insights that didn’t make it into the printed report.

Data visualization is certainly about the destination: it’s important to land on the graph that clearly conveys the intended message. But the journey, the revisions process, will ensure you get there. Happy graphing!

The American Evaluation Association is celebrating Data Visualization and Reporting (DVR) Week with our colleagues in the DVR Topical Interest Group. The contributions all this week to aea365 come from DVR TIG members. Do you have questions, concerns, kudos, or content to extend this aea365 contribution? Please add them in the comments section for this post on the aea365 webpage so that we may enrich our community of practice. Would you like to submit an aea365 Tip? Please send a note of interest to aea365@eval.org. aea365 is sponsored by the American Evaluation Association and provides a Tip-a-Day by and for evaluators.

 


Hello, my name is Jayne Corso and I am the Community Manager for AEA. Posting on multiple social media sites requires good imagery, and on a low budget this can be tough. Images make your content eye-catching and can even add context to a post. On all channels, posts with images outperform those without. Canva is an easy and free way to create your own graphics, charts, infographics, and images. Today, I will show you how to create an image using free Canva formats, layouts, and photos.

Rad Resource: Choose your format

Each social media channel has a preferred image size. This size allows your photos to be viewed clearly in a newsfeed. Canva takes the guesswork out and helps you create images specifically for each channel. They have an array of sizes you can choose from, and you can even create a custom design by entering your own dimensions. For this example, we will choose the Facebook post format.

Rad Resource: Find a Layout

Canva offers many free layouts that you can edit with your own content. Simply click on the layout you like and it will be added to your canvas.

Rad Resource: Edit your image

Once you have selected your desired layout, you can now add photos and text to your image. If you have a photo you would like to use, simply upload it to Canva under “uploads”. If you don’t have a photo, you’re in luck. Canva offers high quality stock photos for free. Browse the collection and find the one that works for your graphic. Once you find the photo, drag it onto the canvas.

Next, click on the text of your image and update the content. You can also change the color of text and backgrounds as you desire.

Once you are happy with your creation, download your image by selecting the “download” button in the right corner. Now you can post it to Facebook and promote your webinar!

I look forward to seeing lots of designs in my newsfeed!

Do you have questions, concerns, kudos, or content to extend this aea365 contribution? Please add them in the comments section for this post on the aea365 webpage so that we may enrich our community of practice. Would you like to submit an aea365 Tip? Please send a note of interest to aea365@eval.org . aea365 is sponsored by the American Evaluation Association and provides a Tip-a-Day by and for evaluators.


Jambo! Veronica Olazabal of The Rockefeller Foundation and Alyna Wyatt of Genesis Analytics here to share our recent experience at the 8th African Evaluation Association (AfrEA) Conference, held last week in Kampala, Uganda. This event happens roughly every two years and brings together more than 600 evaluation practitioners from across Africa.

The challenges of the developing world have been exacerbated by multiple crises: the global recession, the food and fuel crises, and natural disasters. In response, the nature of poverty alleviation interventions across Africa and the globe has changed. Interventions now often involve multiple components, multiple levels of implementation, multiple implementing agencies with multiple agendas, and long causal chains with many intermediate outcomes – all of this reflecting the complexities of the world in which we live. Additionally, details of the intervention often unfold and change over time in ways that cannot be completely controlled or predicted in advance.

To deepen evaluative thinking and practice in response to these trends, The Rockefeller Foundation funded Genesis Analytics to develop and deliver a strand at the AfrEA Conference focused on innovations in evaluation across two main areas: 1) New Forces in Development and 2) New Frontiers in Evaluation Methodology.

The New Forces in Development sub-strand highlighted the emergence of innovative finance in Africa and how this new trend combines market forces with social goals in a traditional ‘developmental’ context. A discussion on impact investing, hybrid funds, commingling funds, social impact bonds, and public-private partnerships brought attention to how these new forces are compatible and complementary. Through four parallel sessions, participants explored innovative finance, complexity, market systems innovation, and PPPs, and the measurement and evaluation thereof.

While these development trends are emerging and evolving, there is a growing recognition that conventional evaluation approaches may need to be rightsized for these types of designs, and that there is a need for measurement and evaluation methods that take into account the multi-faceted, multi-stakeholder, complex environment.

The second sub-strand, New Frontiers in Evaluation Methodology, focused on evaluation innovations that are evolving to suit these trends in Africa while ensuring attention to participation and cultural issues.

The most exciting outcome of the conference was the enthusiastic conversations among African practitioners committed to continuing to push the frontiers of measurement and evaluation in an evolving development landscape.

Other upcoming international evaluation convenings include the EvalPartners Global Evaluation Forum in Kyrgyzstan (April 26-28) and the Evaluation Conclave in Bhutan (June 6-9), organized by the Community of Evaluators South Asia. Keep your eyes and ears out for details that will be shared in the coming months.

Rad Resources:

  • Interested in learning more about AFREA? See here for additional detail.
  • Stay connected to international evaluation by joining the ICCE TIG here.

The American Evaluation Association is celebrating International and Cross-Cultural (ICCE) TIG Week with our colleagues in the International and Cross-Cultural Topical Interest Group. The contributions all this week to aea365 come from our ICCE TIG members. Do you have questions, concerns, kudos, or content to extend this aea365 contribution? Please add them in the comments section for this post on the aea365 webpage so that we may enrich our community of practice. Would you like to submit an aea365 Tip? Please send a note of interest to aea365@eval.org. aea365 is sponsored by the American Evaluation Association and provides a Tip-a-Day by and for evaluators.

 


Hi! I’m Kristin Lindell and I work on the Monitoring, Evaluation, Research, and Learning (MERL) workstream as part of USAID’s Learning and Knowledge Management contract (LEARN). LEARN helps USAID and implementing partners integrate systematic and intentional collaborating, learning, and adapting (CLA) into their work to improve development outcomes. One of our core values on LEARN is “walking the talk” of CLA: we collaborate with key partners to avoid duplication of effort; we take time to pause and reflect; and we learn from our work to make adjustments informed by data.

One way we “walk the talk” is through our MERL cycle, which supports our adaptive management work. Every quarter, my team aggregates key performance indicators from each of LEARN’s five workstreams and hosts a participatory, two-hour discussion reflecting on several key questions: 1) What do these data mean? 2) What should we keep doing that’s going well? 3) What should we stop doing? 4) What should we change? We capture notes from these sessions to share back with the team. These documented conversations then feed into our semi-annual work plans and Performance Monitoring Report. Ultimately, this system helps us understand our progress to date and informs our future work.

The USAID LEARN team pauses and reflects during an annual long-term vision retreat.

Hot Tips:

  • When designing a MERL cycle that facilitates adaptive management, start by asking your stakeholders: What do you want to learn? How will this inform your decision-making processes? When we began this process on LEARN, we had to strike a balance between collecting a sufficient amount of data and actually being able to make decisions with the data. We believe that a focus on learning and decision-making rather than accountability alone helps teams prioritize certain indicators over others.
  • Reflection and learning moments that feed into existing planning and reporting cycles can lead to program adaptations. On LEARN, our reflections on our data influence our six month work plans and management reports. For example, my team recently decided to discontinue a study we had been planning because survey and focus group data showed the study would not yield results that would be convincing to our target audience.
  • If you’re struggling with adaptive management more broadly, consider your organization’s culture. Beyond “walking the talk,” LEARN’s other core values include openness, agility, and creativity. These principles encourage team members to challenge assumptions, be adaptive, and take risks, which all help to cultivate an enabling environment for adaptive management. Ask yourself: does the culture of my organization lend itself to adaptive management? If not, what can I do to change that?

Rad Resources:

  • Want to see what LEARN’s MERL plan looks like? Check it out on USAID Learning Lab.
  • Want to know more about adaptive management and evaluation? Better Evaluation recently pulled together resources about the connections between the two.

The American Evaluation Association is celebrating International and Cross-Cultural (ICCE) TIG Week with our colleagues in the International and Cross-Cultural Topical Interest Group. The contributions all this week to aea365 come from our ICCE TIG members. Do you have questions, concerns, kudos, or content to extend this aea365 contribution? Please add them in the comments section for this post on the aea365 webpage so that we may enrich our community of practice. Would you like to submit an aea365 Tip? Please send a note of interest to aea365@eval.org. aea365 is sponsored by the American Evaluation Association and provides a Tip-a-Day by and for evaluators.


Jindra Cekan

Hello. My name is Jindra Cekan, and I am the Founder and Catalyst of Valuing Voices at Cekan Consulting LLC. Our evaluation and advocacy network has been working on post-project (ex-post) evaluations since 2013.

Lessons Learned:

Most funders and implementers value interventions that have enduring impact beyond the life of the project. We believe that sustained impact and effectiveness can be measured only by returning after projects close. Our research indicates that despite more than $5 trillion invested in international programming since 1945, fewer than 1% of projects have been evaluated for sustained impact.[1] After searching through thousands of documents online, we found fewer than 900 post-project evaluations of any kind, including 370 publicly available ones that interviewed the project stakeholders who are expected to sustain results once projects finish. Their views are key if we are going to meet the Sustainable Development Goals by 2030,[2] for without such feedback our industry’s claim to do sustainable development falters.

We call this new type of evaluation Sustained and Emerging Impacts Evaluation (SEIE). The focus is on long-term impacts, both intended and unintended/emerging, after project closeout. Our guidance on SEIE draws on our global, growing database of post-project evaluations, our SEIE consulting, and a joint presentation at the 2016 AEA Conference, “Barking up a Better Tree”.

The guidance outlines:

  1. What is SEIE?
  2. Why do SEIE?
  3. When to do SEIE?
  4. Who should be engaged in the evaluation process?
  5. What definitions and methods can be used to do an SEIE?

Valuing Voices was just awarded a research grant from Michael Scriven’s Faster Forward Fund to do a desk-study comparison of eight post-project (ex-post) evaluations and their final evaluations, to better demonstrate the value added of SEIEs. The learning does not stop post-project, as there are rich lessons for projects currently being funded, designed, implemented, and evaluated.

Project cycle learning is incomplete without looking at sustained impact post-project, as sustainability lessons need to be fed into subsequent designs. Opportunities abound for evaluating sustainability around the cycle:

  • How is sustainability embedded in funding and partnership agreements?
  • What data are selected at baseline and retained post-project, and by whom?
  • What feedback about prospects for sustainability is being monitored, and how are feedback loops informing adaptive management?
  • When, how, and with whom are project close-out and handover done?

Rad Resources:

The Better Evaluation page on SEIEs, including examples of where impact was sustained, increased, or decreased, or where new impacts emerged;

Valuing Voices repository and blogs on post-project SEIE evaluations.

Great work on exit strategies, including USAID/FHI 360/Tufts work on exit strategies, UK INTRAC’s resources on NGO exit strategies as well as a webinar on sustained impact, plus Tsikululu’s work on CSR exit strategies.

Underlying our work is a desire for accountability and transparency to both our clients (our donors and taxpayers) and those who take over (the national partners: governments, local NGOs, and of course the participants themselves).

[1] This is based on an extensive scan of documents posted on the Internet, as well as requests to numerous funders and implementing agencies through Valuing Voices’ networks.

[2] Currently, EvalSDG is focused on building M&E capacity and amassing data on 230 indicators, such as income, health, and education, but these are unrelated to the sustainability of development projects’ results.

The American Evaluation Association is celebrating International and Cross-Cultural (ICCE) TIG Week with our colleagues in the International and Cross-Cultural Topical Interest Group. The contributions all this week to aea365 come from our ICCE TIG members. Do you have questions, concerns, kudos, or content to extend this aea365 contribution? Please add them in the comments section for this post on the aea365 webpage so that we may enrich our community of practice. Would you like to submit an aea365 Tip? Please send a note of interest to aea365@eval.org. aea365 is sponsored by the American Evaluation Association and provides a Tip-a-Day by and for evaluators.

Kate Goddard Rohrbaugh

I am Kate Goddard Rohrbaugh, an evaluator in the Office of Strategic Information, Research, and Planning at the Peace Corps in Washington, DC. Today I am writing about lessons learned when planning and executing a cross-sectional analysis from studies conducted in multiple countries, and I will provide some cool tricks for writing syntax.

Between 2008 and 2012, the Peace Corps funded the Host Country Impact Studies in 24 countries. In 2016, the Peace Corps published a cross-sectional analysis of 21 of these discrete studies. An infographic summarizing some of the key findings is offered below. To download the entire study, go here.

The data for this study were collected by local researchers. Their assignment was to translate the standard data collection instruments provided by our office, work with staff at Peace Corps posts to add questions, collect the data, enter the data, report the findings, and submit the final products to our office.

Lessons Learned:

  1. Understand the parameters of your environment
    • The Peace Corps is budget conscious; thus, studies were staggered so that the funding was spread out over several years.
    • The agency is subject to a rule that limits employment for most staff to five years, which makes excellent documentation essential.
  2. Pick your battles regarding consistency
    • Start with the big picture in mind and communicate that programmatic data are most valuable for an agency when reviewed cross-sectionally.
    • Give your stakeholders some latitude, but establish some non-negotiables in terms of the wording of key questions, variable labels, variable values that are used, and the direction of the values (1=bad, 5=good).
  3. Use local researchers strategically
    • There are many pros to working with local researchers. As a third party, they can help reduce positivity bias, they have local knowledge, hiring locally builds good will, and it is less expensive than using non-local researchers.
    • There are cons as well: there is less inter-rater reliability, a greater need for quality control, and the capacity to report findings was uneven.
  4. Enforce protocols for collecting end products
    • It is essential that the final datasets are collected and clearly named, along with the interview guides, reports, and codebooks.

Cool Tricks:

Merging multiple datasets with similar, but not always identical, variables is enormously challenging. To address these challenges, we relied heavily on Excel for inventorying the data and creating syntax files for SPSS.

The most useful function for coding in Excel is “=CONCATENATE”. Using this command, you can write code for renaming variables, assigning labels, identifying missing values, assigning formats, and so on. For example, for formatting variables in SPSS:

  • Your function would look like this:
    • =CONCATENATE("formats ",T992," (f",U992,".0).")
  • (Here cell T992 holds the variable name and U992 holds the display width.)
  • But your SPSS syntax looks like this:
    • formats varname1 (f1.0).

After creating a column of formulas for a series of data, you can just copy and paste the whole column into your syntax file, run, and save.
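
The same trick works for other repetitive syntax. As a hypothetical example (the column letters here are illustrative, not from the actual inventory), if column T holds the variable name and column V holds the label text, you can generate a VARIABLE LABELS command for each row:

  • Your function would look like this:
    • =CONCATENATE("variable labels ",T992," '",V992,"'.")
  • And your SPSS syntax looks like this (assuming V992 contains the text Respondent county):
    • variable labels varname1 'Respondent county'.

Copy the formula down the column, paste the results into your syntax file, and run it alongside the formats commands.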

The American Evaluation Association is celebrating International and Cross-Cultural (ICCE) TIG Week with our colleagues in the International and Cross-Cultural Topical Interest Group. The contributions all this week to aea365 come from our ICCE TIG members. Do you have questions, concerns, kudos, or content to extend this aea365 contribution? Please add them in the comments section for this post on the aea365 webpage so that we may enrich our community of practice. Would you like to submit an aea365 Tip? Please send a note of interest to aea365@eval.org. aea365 is sponsored by the American Evaluation Association and provides a Tip-a-Day by and for evaluators.

