AEA365 | A Tip-a-Day by and for Evaluators


Hello, we are Chithra Adams, Director of Evaluations at the Human Development Institute at the University of Kentucky, and Leah Goldstein Moses, CEO of The Improve Group, a research and evaluation consulting firm based in St. Paul, Minnesota.

There is an African proverb that says, “If you want to go fast, go alone. If you want to go far, go together.” This week, we are curating a series of posts about the efficacy and potential pitfalls of working in teams. This post offers reflections on some key lessons learned in leading evaluation teams.

Lessons Learned:

  • Different situations call for different types of leadership, but some leadership behaviors can conflict with each other. For example, a leader might adopt an open, facilitative mindset during a brainstorming session, yet need a task-oriented mindset to ensure an evaluation project is implemented on time.
  • Teams adopt different strategies along a continuum for getting their work done – from full co-creation and collaboration to splitting tasks among team members. It can be useful to determine which strategy best suits the project at hand. If teams fall into a habit of working the same way all the time, they miss out on the benefits of other ways of working.
  • The level of team functioning contributes to both the success of a project and the experience of a project. When your team works well together, you can get more done, your team can go into greater depth by attending to various perspectives, AND the members of the team can have a great time with each other.
  • Finally, sometimes it takes a while to see the impact of evaluation. So take time to celebrate incremental successes and practice gratitude.

Hot Tips:

  • Like any skill, leadership improves with practice. Set aside time to reflect and foster self-awareness about your own responses and how they may influence your team’s dynamics. Remember to be kind to yourself and forgive your mistakes.
  • Likewise, extend this attitude of kindness and forgiveness to team members. Many teams adopt norms or values as they form. “Assume good intent” can be a useful motto.

Rad Resources:

Could clarifying values help your team? Try this exercise from Entrepreneur Magazine on developing your team’s shared values.

If you feel your team is in a rut, this blog lists great activities for team-building.

The American Evaluation Association is celebrating Evaluation Teams Week. Do you have questions, concerns, kudos, or content to extend this aea365 contribution? Please add them in the comments section for this post on the aea365 webpage so that we may enrich our community of practice. Would you like to submit an aea365 Tip? Please send a note of interest to aea365@eval.org. aea365 is sponsored by the American Evaluation Association and provides a Tip-a-Day by and for evaluators.


Hi, I’m Claire Stoscheck, a Research Analyst with The Improve Group, an evaluation consulting firm based in Minnesota. I wanted to share with you an important part of this work, and one I am constantly reminded of – how creating space for clients to contribute their specialized knowledge leads to innovation and great ideas in evaluation projects.

We are currently building capacity for evaluation with the Lincoln Park Children and Families Collaborative (LPCFC) to evaluate the healthy spaces it is developing in the Lincoln Park neighborhood of Duluth, Minnesota. The Collaborative is integrating Commercial Tobacco Free Zones with spaces that inspire and facilitate healthy eating and physical activity for a wraparound healthy experience for community members.

Lesson Learned: From the start, the Collaborative’s executive director, who does have evaluation experience, emphasized the community expertise among staff – we took this as our cue to design the first capacity-building workshop in a way that maximized that knowledge. We provided basic evaluation training with hands-on activities, followed by co-creation of data collection methods, which maximized participation and included a diversity of perspectives. Through thoughtful facilitation, staff provided rich and relevant ideas, like including photo-stories as a data collection method and adding the evaluation question of how a community defines health for itself. The cultural and experiential expertise that LPCFC staff and volunteers brought to the table was synergistic with the technical expertise that I brought.

Together, we created interactive surveys for youth with methods like dot voting, and identified well-attended weekly dinners as a possible strategy for data collection – again, tapping into the knowledge of local program staff. In this way, we all benefited from a better evaluation based on shared knowledge. In the fall, we will do another capacity-building workshop together, this time focused on participatory data analysis.

Lesson Learned: The LPCFC project also reminded me how people who aren’t evaluators can be just as passionate about evaluation. To create that culture of collaboration – especially in evaluating culturally specific programs – it is crucial to approach projects as equals, understanding that clients bring information that is as valuable as, or more valuable than, the tools we bring as professional evaluators. Along the way, we as evaluators can learn very applicable skills from our clients – a humbling reminder.

LPCFC is becoming infused with an evaluative culture, as highly engaged staff members realize another way they can use their community expertise. When communities build their own evaluation and research skills, they can more easily design and implement evaluations and research projects that answer key questions emerging from their own communities.

The American Evaluation Association is highlighting the work of The Improve Group. The contributions all this week to aea365 come from staff of The Improve Group. Do you have questions, concerns, kudos, or content to extend this aea365 contribution? Please add them in the comments section for this post on the aea365 webpage so that we may enrich our community of practice. Would you like to submit an aea365 Tip? Please send a note of interest to aea365@eval.org. aea365 is sponsored by the American Evaluation Association and provides a Tip-a-Day by and for evaluators.

My name is Sophia Guevara and I am the Special Libraries Association Information Technology Division (SLA IT) Chair and American Evaluation Association Social Network Analysis Topical Interest Group Co-Chair. At SLA IT, I currently lead the Executive and Advisory boards. In an effort to bring the members of these boards together, I asked the board members to work collaboratively on a presentation using Google Slides to showcase their accomplishments as a team.

Rad Resource: Google Slides

I chose Google Slides because I had experience collaborating with others using the Google Docs tool. Creating a slide document was quite easy: after developing introductory slides, I inserted a blank slide for each member of our executive and advisory boards, giving each participant an opportunity to share his/her accomplishments over the past few months. Using Slides’ sharing option, I emailed an invite that provided edit access to each board member.

Rad Resource: Freeconferencepro

Once the presentation was developed, we used Freeconferencepro to deliver it in conjunction with Google Slides. For those who are unaware of this tool, it provides a free conference line you can use to present, whether on your own or with others you choose. This allowed board members, conference attendees, and others to access the information regardless of where they were located. In addition, for those unable to attend the meeting, Freeconferencepro’s recording option allowed me to create a meeting recording that others could access at a later time.

Lessons Learned

The project required several follow-up reminder emails encouraging each board member to complete his/her slide. In these reminders I included a link to view the presentation; however, this confused some members, who let me know that the link gave them no permission to edit. The lesson learned was to send reminders with a link that carries edit permissions, so recipients aren’t left confused about how to complete their slide.

With that being said, one board member indicated that while he did not have experience with Google Slides prior to this project, he had previously used Google Docs and found that it was very similar. In addition, after the experience, his opinion of this tool was that it was an “effective way to communicate main points of a discussion or reports” and that the combination of Google Slides and Freeconferencepro was an effective way to share information among a distributed group.
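All of the sharing above was done by hand in the Google Slides web interface, which works fine for a small board. For a larger group, the same edit-access grants could be scripted. The snippet below is a hypothetical sketch using the Google Drive API v3 Python client, not a step described in this post; the file ID, email list, and the `creds` credentials object are placeholders you would supply after setting up OAuth for your Google account.

```python
# Hypothetical sketch: grant edit ("writer") access to a Google Slides file
# via the Drive API v3, so every reminder link carries edit rights.
# Assumes `creds` is an authorized google-auth credentials object and that
# the file ID and email addresses below are replaced with real values.
from googleapiclient.discovery import build

PRESENTATION_ID = "your-file-id-here"  # placeholder, not a real ID
BOARD_EMAILS = ["chair@example.org", "member@example.org"]  # placeholders

drive = build("drive", "v3", credentials=creds)

for email in BOARD_EMAILS:
    drive.permissions().create(
        fileId=PRESENTATION_ID,
        body={"type": "user", "role": "writer", "emailAddress": email},
        sendNotificationEmail=True,  # emails each member a link with edit access
    ).execute()
```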

The American Evaluation Association is celebrating Nonprofit and Foundations TIG Week with our colleagues in the NPF AEA Topical Interest Group. The contributions all this week to aea365 come from our NPFTIG members. Do you have questions, concerns, kudos, or content to extend this aea365 contribution? Please add them in the comments section for this post on the aea365 webpage so that we may enrich our community of practice. Would you like to submit an aea365 Tip? Please send a note of interest to aea365@eval.org. aea365 is sponsored by the American Evaluation Association and provides a Tip-a-Day by and for evaluators.


My name is Rebecca Woodland and I have had the pleasure of working to evaluate and cultivate organizational collaboration in a range of contexts and for many different purposes. In this post I’ll share tips that evaluators can use in the developmental, formative, and summative evaluation of inter-organizational and inter-personal collaboration. I’m excited to be delivering an AEA 2015 pre-conference workshop that goes into detail about these hot tips – maybe I’ll see you there!

Hot Tip #1 – Make collaboration less messy. Though ubiquitous, “collaboration” persists as an under-empiricized concept. One of the first things evaluators looking to assess collaboration will need to do is operationalize the construct. Familiarize yourself with collaboration theory and find specific suggestions for facilitating a shared understanding of collaboration in Utilizing Collaboration Theory to Evaluate Strategic Alliances and the Collaboration Evaluation and Improvement Framework.

Hot Tip #2 – More collaboration is not always better. Levels of integration between organizations matter, but the scope and scale of integration should match the purpose and goals of the alliance.

  • The Levels of Organizational Integration Rubric (LOIR) describes five possible levels of inter-organizational integration and the purposes, strategies/tasks, leadership/decision-making, and communication characteristics that tend to be present in each. Use the LOIR to measure and cultivate ideal levels of inter-organizational collaboration.

Hot Tip #3 – Avoid “co-blaboration.” The evaluation of inter-personal collaboration can help organizational stakeholders avoid “collaboration lite,” whereby mere congeniality and imprecise conversation are confused with the type of disciplined inquiry vital to the diffusion of innovation and attainment of essential outcomes.

  • The Team Collaboration Assessment Rubric (TCAR) describes four fundamental elements of high-quality inter-personal collaboration: dialogue, decision-making, action, and evaluation. Evaluators are encouraged to adapt and administer the TCAR in ways that are most feasible, useful, and appropriate for the context of their program evaluation.

Hot Tip #4 – Use Social Network Analysis (SNA) methods (if you don’t already). SNA is a sophisticated yet accessible means of assessing organizational collaboration. Evaluators can use SNA to mathematically describe and visually depict how “ties” between organizations or people form, and how these links may influence program implementation and the attainment of desired outcomes.
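For a flavor of what this looks like in practice, here is a minimal sketch using the open-source networkx library in Python. The organizations and ties are invented for illustration; a real analysis would build the edge list from survey or partnership data.

```python
# Minimal SNA sketch with networkx; organizations and ties are hypothetical.
import networkx as nx

# Each pair is a reported working relationship ("tie") between two partners.
ties = [
    ("Health Dept", "School District"),
    ("Health Dept", "Food Bank"),
    ("Food Bank", "Faith Coalition"),
    ("School District", "Parks & Rec"),
    ("Health Dept", "Parks & Rec"),
]
G = nx.Graph(ties)

# Density: the share of possible ties that actually exist (0 to 1).
print(f"Network density: {nx.density(G):.2f}")

# Degree centrality: which partners are most connected.
for org, score in sorted(nx.degree_centrality(G).items(),
                         key=lambda kv: kv[1], reverse=True):
    print(f"{org}: {score:.2f}")

# nx.draw(G, with_labels=True) would render a quick sociogram
# (requires matplotlib).
```

Even a toy network like this surfaces the kinds of questions SNA raises: who brokers connections, and what happens to program implementation if a central partner drops out.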

Rad Resource:

Coalitions that Work® offers excellent tools for evaluating coalitions and partnerships, available in PDF format.

Want to learn more? Register for Evaluating and Improving Organizational Collaboration at Evaluation 2015 in Chicago, IL!

This week, we’re featuring posts by people who will be presenting Professional Development workshops at Evaluation 2015 in Chicago, IL. Click here for a complete listing of Professional Development workshops offered at Evaluation 2015. Do you have questions, concerns, kudos, or content to extend this aea365 contribution? Please add them in the comments section for this post on the aea365 webpage so that we may enrich our community of practice. Would you like to submit an aea365 Tip? Please send a note of interest to aea365@eval.org. aea365 is sponsored by the American Evaluation Association and provides a Tip-a-Day by and for evaluators.

My name is Susan Wolfe and I am the owner of Susan Wolfe and Associates, LLC, a consulting firm that applies Community Psychology principles to strengthening organizations and communities. A lot of my work involves helping to develop and evaluate community coalitions.

Because of the dynamic nature of coalitions, facilitating their development and providing evaluative feedback can require a range of approaches and tools. It requires looking inside the group to assess the dynamics and whether members are working well together, and outside the group to continuously scan the context for changing political, economic, and cultural factors and to determine whether the coalition is having an impact or fulfilling its purpose.

Hot Tip: Before you begin to evaluate a coalition, spend time attending meetings and getting to know its membership. Find out as much as possible about the politics, member organizations, community, and its purpose. Determine whether there are hierarchies among organizations and groups and uncover agendas and interests.

Rad Resource: One of my favorite coalition-related books is The Power of Collaborative Solutions by Tom Wolff. In addition to Wolff’s six principles for creating collaborative solutions for healthy communities, the book includes an Appendix on Evaluation with tools and access to a website with even more tools for developing and evaluating coalitions. Also check out Tom’s web site at www.tomwolff.com for more resources and sign up for his Collaborative Solutions newsletter.

Hot Tip: Use a range of methods to provide feedback to coalition leadership and members. For example, combine results of a SWOT (Strengths, Weaknesses, Opportunities, Threats) analysis with those from a member survey.

Lesson Learned: When you present the results of a survey, use an interactive approach to encourage dialogue about the findings. Include time for members to talk about what they liked and did not like about the survey instrument itself and whether any questions remain unanswered.

Rad Resource: Attend the Society for Community Research and Action’s (SCRA) Summer Institute workshop on Developing and Evaluating Coalitions, facilitated by Tom Wolff and Susan Wolfe on June 24th in Lowell, MA at the UMass Lowell Inn and Conference Center. Stay around for the SCRA 2015 Biennial Conference from June 25th through 28th. More information can be found at www.scra27.org.

The American Evaluation Association is celebrating CP TIG Week with our colleagues in the Community Psychology Topical Interest Group. The contributions all week come from CP TIG members. Do you have questions, concerns, kudos, or content to extend this aea365 contribution? Please add them in the comments section for this post on the aea365 webpage so that we may enrich our community of practice. Would you like to submit an aea365 Tip? Please send a note of interest to aea365@eval.org. aea365 is sponsored by the American Evaluation Association and provides a Tip-a-Day by and for evaluators.


My name is Ann Price and I am the President of Community Evaluation Solutions, Inc. (CES), a consulting firm based just outside Atlanta, Georgia. I am a community psychologist and infuse environmental approaches into my work developing and evaluating community prevention programs. Much of that work involves community coalitions.

Hot Tip: Appreciate how long it takes for community coalitions to mature. Often, community members want to jump in and get right to work. However, the first thing community coalitions need to do is develop structures and processes that will help ensure their long-term success. It may be helpful to work with your coalition to develop a logic model that details the steps the coalition needs to take in order to be successful. Here is one example from our work with the Drug-Free Coalition of Hall County, based on Fran Butterfoss and Michelle Kegler’s Community Coalition Action Theory (2002). Having this logic model helped coalition members focus on establishing a good foundation and recognize the importance of planning and evaluation.

[Image: coalition logic model]

Rad Resource: Fran Butterfoss’s book, Coalitions and Partnerships in Community Health (2007), is a great reference book for coalition leaders, researchers and evaluators. It includes surveys that coalition leaders can use to assess the health of their coalition.

Rad Resource: Fran Butterfoss has a new book, Ignite! Getting Your Community Fired Up for Change, an excellent and accessible resource for coalition leaders and members filled with tips to inspire coalitions to action. 

Hot Tip: Community Anti-Drug Coalitions of America (CADCA) is another good resource for both coalitions and evaluators. They host the National Leadership Forum each December in Washington, D.C. and the Mid-Year Training Institute at various locations around the country. Both meetings include one-to-one coaching for coalition leaders and a separate track for youth, the National Youth Leadership Initiative.

Lesson Learned: “Evaluation as intervention” is a concept I have been pondering lately. When you find your coalition is stuck in a “meet and talk” rut, think about redesigning the evaluation to focus on the environmental change strategies the coalition has implemented and the community reach of each strategy. Work on documenting the link between their chosen strategies and community outcomes. Then, use evaluation data to provide more timely feedback to the coalition. This would be a great opportunity to involve coalition members in discussions about where they are, where they would like to be and how, working together, they can get there.

The American Evaluation Association is celebrating CP TIG Week with our colleagues in the Community Psychology Topical Interest Group. The contributions all week come from CP TIG members. Do you have questions, concerns, kudos, or content to extend this aea365 contribution? Please add them in the comments section for this post on the aea365 webpage so that we may enrich our community of practice. Would you like to submit an aea365 Tip? Please send a note of interest to aea365@eval.org. aea365 is sponsored by the American Evaluation Association and provides a Tip-a-Day by and for evaluators.


Hello friends in evaluation, my name is Rebecca Woodland and I’m an associate professor of educational leadership at UMass Amherst. I’ve been a contributor to AEA in a variety of ways on the topic of evaluating and improving organizational and inter-professional collaboration. I’m especially passionate about using evaluation to cultivate meaningful teacher collaboration in PreK-12 school settings. In this post I’d like to share some tips and tools for assessing teacher collaboration. Evaluators can use these tools to help stakeholders avoid “collaboration lite,” whereby mere congeniality and imprecise conversation are confused with the serious professional dialogue vital to instructional change, student learning, and school improvement.

Hot Tip – K-12 educators are passionate about teacher collaboration, and know that it is the vehicle to instructional improvement. Unfortunately, the term collaboration, although ubiquitous, persists as a messy (under-empiricized, under-operationalized) construct. Fortunately, evaluators are uniquely positioned to help stakeholders make sense of – to raise shared literacy about – what teacher collaboration ideally looks and feels like.

Rad Resources – 1) Evaluating and Improving the Quality of Teacher Collaboration: A Field-Tested Framework for School Leaders (NASSP Bulletin, 2008), and 2) Evaluating the Imperative of Inter-Personal Collaboration: A School Improvement Perspective (American Journal of Evaluation, 2007; http://aje.sagepub.com/content/28/1/26.short).

Both articles, co-authored with my colleague Chris Koliba, present a theoretical frame for inter-professional collaboration and specific suggestions for how evaluators can facilitate shared stakeholder understanding of collaboration.

Hot Tip – Collaboration can be operationalized (and measured)! Teacher collaboration entails an ongoing cycle of dialogue, decision-making, action, and evaluation, through which teachers build their knowledge and skills and make targeted changes to classroom practice – the primary factors attributed to improvements in student learning.

Rad Resource – The Teacher Collaboration Assessment Survey (TCAS). The TCAS is a validated instrument for the systematic assessment and targeted improvement of teacher collaboration. Evaluators can use this tool in a variety of ways to evaluate the process and outcomes of teacher collaboration. Access the TCAS in: Woodland et al. (2013), A Validation Study of the Teacher Collaboration Assessment Survey, in Educational Research and Evaluation: An International Journal of Theory and Practice.

The evaluation of teacher collaboration can help build educator capacity to recognize and strengthen attributes of teacher teaming, and to make systematic, evidenced-based improvements to instructional practice that lead to greater student learning.

See you in Denver for AEA 2014!

This contribution is from the aea365 Tip-a-Day Alerts, by and for evaluators, from the American Evaluation Association. Please consider contributing – send a note of interest to aea365@eval.org. Want to learn more from Rebecca? She’ll be presenting as part of the Evaluation 2014 Conference Program, October 15-18 in Denver, Colorado.


Greetings!  We are Tom McQuiston (USW Tony Mazzocchi Center) and Tobi Mae Lippin and Kristin Bradley-Bull (New Perspectives Consulting Group).  We have collaborated for over a decade on participatory evaluation and assessment projects for the United Steelworkers (labor union).  And we have grappled mightily with how to complete high-quality data analysis and interpretation in participatory ways.

Hot Tip: Carefully determine up front what degree of full evaluation team participation there will be in data analysis.  Some practical considerations include:  the amount of team time, energy, interest, and analysis expertise that is available; the levels of data analysis being completed; the degree of project focus on team capacity-building; and the project budget and timeline.  How these and other considerations get weighed is, of course, also a product of the values undergirding your work and the project.

Hot Tip: Consider preparing an intermediate data report (a.k.a. “half-baked” report) that streamlines the analysis process for the full team.  Before the full team dives in, we:  review the raw quantitative data; run preliminary cross-tabs and statistical tests; refine the data report content to include only the — to us — most noteworthy data; remove extraneous columns spit out of SPSS; and assemble the tables that should be analyzed together — along with relevant qualitative data — into reasonably-sized thematic chunks for the team.
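The post describes this step in SPSS, but the same preliminary cross-tab and test can be sketched in any statistics tool. Below is a hypothetical illustration in Python with pandas and scipy; the variables and responses are invented for illustration, not drawn from the authors’ projects.

```python
# Hypothetical sketch of a preliminary cross-tab plus chi-square test,
# analogous to the SPSS step described above.
import pandas as pd
from scipy.stats import chi2_contingency

# Invented survey responses standing in for real member data.
df = pd.DataFrame({
    "site":    ["A", "A", "B", "B", "A", "B", "A", "B"],
    "trained": ["yes", "no", "yes", "yes", "yes", "no", "no", "yes"],
})

# Counts, then row percentages for the "half-baked" report.
table = pd.crosstab(df["site"], df["trained"])
print(table)
print(pd.crosstab(df["site"], df["trained"], normalize="index").round(2))

# A quick chi-square test flags tables worth the full team's attention.
chi2, p, dof, expected = chi2_contingency(table)
print(f"chi2 = {chi2:.2f}, p = {p:.3f}")
```

Whatever the tool, the aim is the same: pre-digest the output so team meeting time goes to interpretation rather than table-wrangling.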

Hot Tip: Team time is a precious commodity, so well-planned analysis/ interpretation meetings are essential.  Some keys to success include:

  1. Invest in building the capacity of all team members. We do this through a reciprocal process: we train other team members in, say, reading a frequency or cross-tab table or coding qualitative data, and they train us in the realities of what we are all studying.
  2. Determine time- and complexity-equivalent analyses that sub-teams can work on simultaneously.  Plan to have the full team thoughtfully review sub-team work.
  3. Stay open to shifting in response to the team’s expertise and needs.  An empowered team will guide the process in ever-evolving ways.

Some examples of tools we have developed — yes, you, too, can use Legos™ in your work — can be found at: http://newperspectivesinc.org/resources.

We never fail to have many moments of “a-ha,” “what now” and “wow” in each participatory process.  We wish the same for you.

The American Evaluation Association is celebrating Best of aea365 week. The contributions all this week are reposts of great aea365 blogs from our earlier years. Do you have questions, concerns, kudos, or content to extend this aea365 contribution? Please add them in the comments section for this post on the aea365 webpage so that we may enrich our community of practice. Would you like to submit an aea365 Tip? Please send a note of interest to aea365@eval.org. aea365 is sponsored by the American Evaluation Association and provides a Tip-a-Day by and for evaluators.


My name is Cassandra O’Neill and I’ve been a consultant for the past 10 years. I’m a member of a network of consultants and coaches called Wholonomy Consulting.  I’m also the President-Elect for the Arizona Evaluation Network and a member of the AEA Local Affiliate Council. A theme in my work is using effective engagement for high impact collaboration.  I have several resources to share with others interested in increasing the effectiveness and impact of collaborations.

Rad Resource: In 2008, Leslie Crutchfield and Heather McLeod Grant published Forces for Good, a study of 12 high-impact nonprofits that identified six practices behind their impact. These six practices are as follows:

  • Serve and Advocate. Partnering with others was essential to doing both of these well and led to high impact.
  • Make Markets Work. By partnering with corporations, these high-impact nonprofits were able to shift corporate practices and work jointly with businesses toward a social good. Many also operated earned-income ventures, which provided stable funding for their work.
  • Inspire Evangelists. By connecting people with a way to act on their passions, high-impact nonprofits generated powerful and enthusiastic supporters who recruited others.
  • Nurture Nonprofit Networks. These nonprofits helped their peers succeed by continuously asking how they could help others benefit from their own organization’s strengths and knowledge, and this resulted in increased value for all.
  • Master the Art of Adaptation. Constantly assessing the results of their actions, gathering input from a wide group, and applying what they learned in a meaningful way led to high impact.
  • Share Leadership. Strong leadership was present in these nonprofits, which had strong a) executive directors, b) seconds-in-command, and c) boards. Their benches were deep, allowing collective leadership to emerge and promoting sustainability.

I developed the following worksheet to help people think about these practices, how they might be currently using them, and how they can build on these successes. You are welcome to use this in your work. http://www.wholonomyconsulting.com/docs/six-keys-to-high-impact-worksheet.pdf

2014 Update: Additional Rad Resources: 

The American Evaluation Association is celebrating Best of aea365 week. The contributions all this week are reposts of great aea365 blogs from our earlier years. Do you have questions, concerns, kudos, or content to extend this aea365 contribution? Please add them in the comments section for this post on the aea365 webpage so that we may enrich our community of practice. Would you like to submit an aea365 Tip? Please send a note of interest to aea365@eval.org. aea365 is sponsored by the American Evaluation Association and provides a Tip-a-Day by and for evaluators.


Hi there! I’m Ann Martin, a postdoctoral fellow and internal evaluator with NASA Innovations in Climate Education, which funds climate education projects as part of NASA’s Minority University Research and Education Program (MUREP). I’m also part of a cross-agency collaboration involving sister programs at the National Science Foundation (NSF) and the National Oceanic and Atmospheric Administration (NOAA).

This collaboration represents more than 100 projects funded to conduct climate education in formal and informal environments; each project funds its own evaluator and determines its own evaluation plan. As part of that tri-agency effort, I’ve helped to facilitate a community of these evaluators. Throughout this week, the aea365 blog will feature posts from members of our community on what we’ve learned about evaluating climate education.

This tri-agency evaluation group is entirely grassroots, depends on the efforts of its members, and functions with extremely limited resources. To kick off Climate Ed Eval Week, I’ll be sharing some thoughts on how to help a community like this work.

Lesson Learned: In April 2012, a large group of almost 40 tri-agency evaluators and funded project leaders got together to work on a common evaluation vision for climate education. The result was a draft logic model describing our portfolio of diverse projects. We found that the process of drafting the model, and negotiating which terms and concepts belonged, was as useful as the product itself. Each project has its own goals, and we worked together to resolve and align those into a representation of what the three agencies are working towards. This also started a long-term conversation, and helped us to identify challenges and opportunities. We’ve also found that evaluators are hungry for a place to share and find evaluation resources, instruments, and reports relevant to their sphere of interest – a place that won’t go away when funding does. We’re seeking solutions to this!

[Image clipped from https://nice.larc.nasa.gov/tri_pi/]

Cool Trick: While meeting in person got our grassroots evaluation group off to a roaring start, it’s tough to get together. Instead, we take advantage of opportunities to hold lunches or meetings at conferences like AEA, AERA, and AGU (going on right now!). This also helps us bring new evaluators and their perspectives into the fold.

Hot Tip: Online collaboration tools help us keep the community going. Our group uses Google Drive to share documents, and we’ve also looked into Sign Up Genius. This handy service allows participants to sign up for tasks (instead of time slots, like Doodle does).

Get Involved: If your evaluation work relates to climate education, and you would like to learn more, contact me at ann.m.martin@nasa.gov. Also, consider joining the STEM Education & Training TIG!

The American Evaluation Association is celebrating Climate Education Evaluators week. The contributions all this week to aea365 come from members who work in a Tri-Agency Climate Education Evaluators group. Do you have questions, concerns, kudos, or content to extend this aea365 contribution? Please add them in the comments section for this post on the aea365 webpage so that we may enrich our community of practice. Would you like to submit an aea365 Tip? Please send a note of interest to aea365@eval.org. aea365 is sponsored by the American Evaluation Association and provides a Tip-a-Day by and for evaluators.

