AEA365 | A Tip-a-Day by and for Evaluators

CAT | Graduate Student and New Evaluators

I am Dawn X. Henderson, a past fellow of AEA’s Graduate Education Diversity Initiative (GEDI) and a member of the Annie E. Casey Foundation’s Expanding the Bench Initiative. I recently developed an undergraduate seminar course in Community Psychology at a Minority Serving Institution. Program evaluation is a core competency in Community Psychology, and modeling evaluation was critical to passing my evaluation “wisdom” on to a group of “underrepresented” students through a partnership with a nonprofit. I aim to share some hot tips and lessons learned with those interested in teaching and working in evaluation.

Hot Tips:

  • Practice logic models. In preparation for the evaluation report, the class met with the Executive Director to obtain information about the nonprofit, focusing on its programming and key activities. Building logic models helped students become familiar with the services the nonprofit provides and develop visual connections among inputs, activities, outputs, and outcomes.
  • Recognize the individual strengths and knowledge of your students/team. Students worked in pairs to perform the quantitative and qualitative analyses; each pair included one student familiar with the methodology and one less experienced student. The less experienced students gained new knowledge about data analysis, and the pairs collaboratively compiled findings into text and graphs.
  • Divide the report into sections and assign main duties and responsibilities. Each section of the evaluation report had a student leader responsible for collecting information, doing the majority of the writing, and maintaining communication with students and faculty. Each student also had to review and summarize an article related to the nonprofit’s programs and services; these summaries were integrated into the discussion or recommendations section of the report.

Lessons Learned:

  • Maintain lines of communication with the nonprofit about progress. Staying in contact with the nonprofit about status, challenges, and its needs can generate feedback and recommendations that improve the report’s content. This process also helps undergraduate students understand how important it is to involve the nonprofit throughout, ensuring the evaluation report accurately represents its program.
  • Develop timelines for important milestones/benchmarks. The majority of the evaluation report was completed at the end of the academic semester, making it a stressful process for the students and for me. Building in benchmarks for each section of the evaluation report would have provided more opportunities for feedback and editing. I had to go through the entire report the night before the draft was due to the nonprofit.

The students approached the preparation of the evaluation report with limited knowledge in evaluation but some familiarity in traditional research in psychology. In the end, students discovered ways to translate research processes into evaluation and the nonprofit received useful information to support their programming and funding efforts.

Do you have questions, concerns, kudos, or content to extend this aea365 contribution? Please add them in the comments section for this post on the aea365 webpage so that we may enrich our community of practice. Would you like to submit an aea365 Tip? Please send a note of interest to aea365@eval.org. aea365 is sponsored by the American Evaluation Association and provides a Tip-a-Day by and for evaluators.


Hello everyone! We are Indira Phukan and Rachel Tripathy, graduate students at the Stanford Graduate School of Education. This past summer, we worked with a local environmental education organization to pilot several tools for evaluating learning outcomes related to stewardship and environmental behavior change. When it comes to understanding students’ perceptions of and relationships with nature, traditional assessment strategies often fall short, so we looked to creative, alternative evaluation techniques to understand student learning. We also conducted interviews and observations to understand how these tools might inform our research. One of the tools we explored this summer was an art-based embedded assessment.

In our pilot, five- to seven-year-old students participating in a weeklong camp at a coastal California campus were asked to make drawings before and after a scheduled activity where learning took place. The drawing prompts provided by their educators were broad enough to allow for the students to make choices about what they drew, but were also designed to direct student thinking toward the target activity. We collected student art and analyzed it with a rubric that considered thematic, analytic, informational, and contextual details. It was incredible to see the kinds of observations being made by six-year-olds! Their drawings definitely captured learning details that a written assessment would not, and the children had fun in the process. Moving forward, we are excited to see how this tool works with other age groups, and how it might be adopted as an embedded assessment strategy by other organizations.

Hot Tip: Site observations and interviews with educators can help researchers and practitioners design embedded assessments that fit seamlessly into existing curriculum and programming. The educators will thank you, and your data will reflect a more representative student experience.

Lesson Learned 1: When analyzing subjective student work, like art, the type of rubric being used is exceedingly important. The rubric should be well thought-out, and designed to tease out information that will answer your research questions. Ultimately, an effective rubric will go through various iterations during the pilot phase before a final version is decided upon.

Lesson Learned 2: Informative student art takes time. Initially, we wanted to give students about 5-10 minutes to produce a drawing, but we quickly learned that they needed at least 15 minutes to create something that we could properly analyze for insights about their learning.

Rad Resource 1: Rubric Library is a great online resource for browsing rubrics that others have used, and for finding inspiration for creating your own.

Rad Resource 2: Ami Flowers and her colleagues wrote a great article on using art elicitation for assessment. We drew a lot of inspiration from their findings.

AEA is celebrating GSNE Week with our colleagues in the Graduate Student and New Evaluators AEA Topical Interest Group. The contributions all this week to aea365 come from our GSNE TIG members. Do you have questions, concerns, kudos, or content to extend this aea365 contribution? Please add them in the comments section for this post on the aea365 webpage so that we may enrich our community of practice. Would you like to submit an aea365 Tip? Please send a note of interest to aea365@eval.org. aea365 is sponsored by the American Evaluation Association and provides a Tip-a-Day by and for evaluators.


Hello fellow evaluation lovers! My name is Elizabeth Grim and I work as an evaluation consultant with The Consultation Center at Yale, where I primarily consult with community-based agencies to build evaluation capacity. Prior to joining The Consultation Center, I worked as a policy analyst for Connecticut’s statewide campaign to end homelessness.

With the growing popularity of social media, evaluators increasingly discuss data visualization and how to communicate evaluation findings to stakeholders. Yet we don’t always talk about how to effectively communicate within our own teams, which is just as important to the success of a project. Effective communication involves fostering workplaces and teams in which people are heard, understood, and acknowledged for their unique contributions.

Lesson 1: Encourage curiosity: Communication is easier when questions and comments come from a place of curiosity rather than judgment. Ask questions when discussing a project or deliverable rather than jumping immediately to feedback and conclusions.

Lesson 2: Know your colleagues: The first step in fostering better communication is developing a relationship with members of the team. How do your colleagues prefer to communicate? What are their unique skills and professional goals? What are they passionate about inside and outside of the office?

Lesson 3: Table technology: Technology provides us with more flexibility in the workplace and allows us to communicate with partners across the globe. However, technology also allows people to talk around issues, reduces the ability to contextualize information through tone of voice or facial expressions, and encourages multitasking, all of which can result in a breakdown in communication. Ask team members to check their non-essential technology at the beginning of a meeting. Consider providing an incentive, like a monthly gift card drawing, for those who go low-tech.

Rad Resource 1: 4 Pillars of Integrity Video Series – Make impeccable agreements. Making impeccable agreements means that you agree only to what you are able and willing to complete, and that you follow through on your agreements. On the flip side, this means you also say no to agreements you are unable or unwilling to complete. Teams are more effective, and members have more trust in each other, when each member takes 100% responsibility for themselves and their actions.

Rad Resource 2: The Great Genius Swap – Work environments and teams are more effective when people enjoy what they’re doing. Conduct a genius swap with your team. Gather the team together and ask each person to write down the one task they most love to do at their job and the one task they would like to stop doing. Find opportunities for team members to continue doing what they love and explore whether you can swap responsibilities around to minimize those they don’t enjoy.


My name is Megan Olshavsky and I’ve been an evaluator of PreK-12 educational programs for about a year and a half now. Before starting my work in a public school district, I was researching learning and memory processes in rats, earning my Ph.D. in Psychology – Behavioral Neuroscience. My experiments were very controlled: the rats exhibited the behavior or they did not, neurons were active or they were not, results were statistically significant or they were not.

Moving from that environment to the “real world” of a school district, which employs and serves humans in all their messiness, caused some growing pains. How was I supposed to decide whether an educational intervention led to academic improvement without proper control and experimental conditions?! One of the first projects I worked on was a developmental evaluation of a technology initiative. Developmental evaluation made me feel even more flaky: “Hey everyone! Let’s monitor things as they unfold. What are we looking for? Not sure, but we’ll know it when we see it.”

As I’ve transitioned from researcher to evaluator, three things have helped me feel more legit.

Lesson Learned 1: Trust yourself. You may not be an expert in the area you are evaluating, but you do have expertise looking at data with a critical eye, asking probing questions, and synthesizing information from a variety of sources.

Lesson Learned 2: Collaborate with a team who has diverse expertise. Our developmental evaluation team engaged teachers, instructional technology specialists, information systems staff, and evaluators. When everyone on that team can come to the same conclusion, I feel confident we’re making the right decision.

Lesson Learned 3: Embrace capacity building as part of your work. No one would recommend training up stakeholders to do their own inferential statistics. You can, however, influence the people around you to think critically about their work. Framing is critical. “Evaluation” is a scary word, but “proving the project/program/intervention is effective” is a win for everyone. Building relationships and modeling the expertise we talked about in Lesson 1 leads to a gradual institutional shift toward evaluative thinking.

Rad Resource: Notorious RBG: The Life and Times of Ruth Bader Ginsburg. Let RBG be your guide as you gather and synthesize the relevant information, discuss with your diverse team, and advocate for slow institutional change.


Hello! My name is Alana Kinarsky, and I am a PhD student in the Evaluation program at the University of California, Los Angeles (UCLA). As a graduate student, I regularly read evaluation theory and conduct my own research on evaluation. However, I often wonder how research on evaluation does or should help practitioners in their daily work.

Cool Trick: To connect research and practice, I organized a pop-up journal club during my summer internship at Informing Change. The journal club gave staff an opportunity to read and discuss evaluation research. I circulated a few theory papers via email, and the group elected to read “Evaluation and Organizational Learning: Past, Present, and Future” by Rosalie Torres and Hallie Preskill. The following week, about 10 of us got together over pizza for a facilitated yet casual conversation. Discussing theory can help evaluation practitioners meaningfully reflect on their practice.

Lesson Learned 1: Theory offers practitioners a framework and context for their evaluation work. As our conversation about the paper unfolded, we zeroed in on a question that weighs heavily in both theory and practice: what is the role of the evaluator? As people around the table began talking through different roles, I noticed their ideas began to align with the Evaluation Theory Tree developed by Marv Alkin and Tina Christie. I sketched it on the board and walked through the different “branches” of evaluation theory. The Theory Tree focused our conversation and grounded some of these theoretical elements, like the role of an evaluator, in a visual analogous to the roles the Informing Change practitioners recognized in their work.

Lesson Learned 2: These conversations are an opportunity for teambuilding. A conversation about theory creates an opportunity for people from different backgrounds and leadership levels to participate in a shared dialogue. During our discussion, we shared personal stories and current challenges, and ideas for future team conversations rose to the surface. Furthermore, people who rarely work together had the opportunity to collaborate and brainstorm with peers.

Lesson Learned 3: Reflection on practice is important. Our work as evaluators is often fast paced so it is easy to get caught up in execution. However, it is important to make time to reflect on the big picture and think creatively. This not only improves an individual’s practice, but also supports organizational learning.

At the end of the hour, I asked the group to quickly share one takeaway from our conversation. The room was buzzing with energy as people shared what they learned and expressed enthusiasm for continuing this practice. The group agreed that stepping away from their desks to talk about theory offered them an opportunity to reflect, build relationships, and generate new ideas.


Hello! We are Laura Sundstrom and Megan Elyse Williams, Evaluation Associates at the Curtis Center Program Evaluation Group (CC-PEG) at the University of Michigan School of Social Work.

At CC-PEG, we train Master of Social Work students in program evaluation by providing high-quality evaluation services to community-based organizations. Our students enter our unit with a variety of experiences and skills. When we were first growing our center, students would get assigned tasks beyond their skill level out of project need. As a result, we developed the Tiers of Skill Development to guide students logically and intentionally through their skill development and professional preparation.

Hot Tip: Make it applicable for your context. We developed these Tiers based on the skills needed to be successful within our Center.  Your organization may value different skills or use a different order of development.

Cool Trick: There are many uses for the Tiers – be creative!

  • Orientation to evaluation. Helping students understand all of the different components and skills that go into evaluation practice.
  • Supervision and mentoring. Working with students to assess their self-efficacy in these skills and where they have practiced these skills in project work.
  • Project management. Helping lead evaluators assign tasks that challenge students but are not out of their reach.
  • Identifying trainings. Skills that many students have not had a chance to develop may be appropriate for a larger training.
  • Personal development. Assisting students in their professional development, advocating for their own learning, and in their job search.

Lessons Learned: After using the Tiers for over a year, we have learned a lot!

  • Project work cuts across the tiers. Students don’t have to complete one full tier before moving on to the next. They can develop skills in certain strands of work – such as qualitative data collection and analysis.
  • Response set is important for understanding “mastery” of a skill. The highest level on the Tiers is “can teach someone else to do it,” which helps contextualize for students what “mastery” of a skill means in the professional world.
  • Identify peer support. Identify students that are ready to work towards “mastery” of a skill and pair them for peer support with another student who needs training.
  • Skill development rather than self-efficacy. The Tiers focus on demonstrating skill development rather than reporting self-efficacy.  Students will be able to point to specific tasks where they practiced a skill instead of saying they are confident in their skill level.


My name is Sophia Guevara and I am a recent graduate of Wayne State University. I am writing about my experience identifying and collaborating with other evaluation leaders and professionals to motivate them to take action around an important issue.

For the Chicago conference, I worked with the leadership of several Topical Interest Groups to develop a toiletry collection for a homeless women’s shelter. As this was the second year our TIG partnership asked AEA for permission to collect toiletries for the homeless, there were some important lessons learned from the project.

Lesson Learned 1: Build awareness to motivate others to take action. With the leadership of the Social Network Analysis; Nonprofit and Foundation; Social Work; and Alcohol, Drug Abuse and Mental Health TIGs informing their members about the collection, many AEA members knew that we were collecting the free travel-sized toiletries placed in attendees’ hotel rooms. With mentions in newsletters and posts in LinkedIn groups, the collection netted a box of donations from generous American Evaluation Association conference attendees.

Lesson Learned 2: Sometimes you will find your best partners through recommendations from those you connect with. In the beginning, I emailed the leadership of larger Topical Interest Groups to gain the support I thought I needed to make the proposal successful in front of the American Evaluation Association. These contacts recommended I also contact the leadership of smaller Topical Interest Groups whose focus was closely aligned to those who were experiencing homelessness.

Lesson Learned 3: Seek expertise from those who may know an area better than you. The idea of collecting toiletries this year for Deborah’s Place came from a professional contact I know who works at a Chicago-based foundation that has supported several activities related to homelessness. Since I was not from Chicago, I reached out to this person who was able to use her expertise to recommend the organization.

Rad Resource 1: The American Evaluation Association Community provides an opportunity to identify the number of members enrolled in each Topical Interest Group community. When researching potential Topical Interest Groups to contact for support, I focused on the larger TIGs first and then identified others by whether their topic of interest might relate to the issue of homelessness.

Rad Resource 2: The AEA Topical Interest Group List provides visitors with contact information for Topical Interest Group leadership and a direct link to each group’s website.

The Graduate Student & New Evaluator TIG is sponsoring this week’s AEA365, so be sure to check out the blogs this week! It’ll be worthwhile for new and seasoned evaluators alike!


Hi! My name is Denise Ramón. I am a doctoral student in education at the University of the Incarnate Word in San Antonio, Texas, and I work at the Center for Civic Leadership, which focuses on civic engagement and leadership. More specifically, I help to connect my university to the community. I am interested in Asset Based Community Development (ABCD).

Lessons Learned: While at the AEA 2014 Denver conference, I attended a session of particular interest to me: Altschuld, Hung, and Lee’s Getting Started in an Asset/Capacity Building and Needs Assessment Effort. Two dichotomous philosophical approaches were presented: needs assessment and asset/capacity building (A/CB). One of the main ideas stemming from this presentation was to create a hybrid framework between needs assessment and asset mapping. If evaluation is evolving to be visionary and sustainable, mixing traditional models, such as needs assessments, with newer ideas, such as capacity building and asset mapping, seems rather logical. This way, the best of both worlds can be extracted: each approach can fill the other’s gaps and complement the other rather than being at odds with it. Given this innovative notion, more research is needed to see whether such a model can really be developed and effectively implemented.

Coming to my second AEA conference expanded my professional network. I participated in most of the social events hosted by AEA, such as the TIG social events, the poster presentation session, and the silent auction. Getting to know others in the field gives me confidence to participate in more evaluation activities because I know I can ask for help and turn to veterans with more expertise. Lesson learned: Jump into AEA with confidence and an open mind. Reach out to others. Network.

Rad Resource: Using the AEA Public eLibrary to find the presentations was very useful for me. I was able to download the presentations and can now use them as references for my research. I highly recommend the AEA eLibrary. You can also upload your own presentations and documents; it is another way to promote your work.

As a doctoral student and novice to the evaluation field, I have found that the mere experience of attending the conferences has enhanced my overall learning and understanding of evaluation. Not only have I learned about new resources to tap into, like the eLibrary, but I have also been able to relate newly learned evaluation concepts to other parts of my professional and academic life and research. This has been due in part to the new connections I have made.

We’re celebrating Evaluation 2014 Graduate Students Reflection Week. This week’s contributions come from graduate students of Dr. Osman Ozturgut of the Dreeben School of Education at the University of the Incarnate Word, along with students from other universities. Do you have questions, concerns, kudos, or content to extend this aea365 contribution? Please add them in the comments section for this post on the aea365 webpage so that we may enrich our community of practice. Would you like to submit an aea365 Tip? Please send a note of interest to aea365@eval.org. aea365 is sponsored by the American Evaluation Association and provides a Tip-a-Day by and for evaluators.

Hello! We are Çigdem Meek, Bashar Ahmed, and Marissa Molina, PhD students at the University of the Incarnate Word in San Antonio, Texas. As novice evaluators, we would like to share what we learned from attending the 28th Annual Conference of the American Evaluation Association in Denver.

Lessons Learned:

  • Attending the conference as a group of PhD students from the same university eased our anxiety of being among expert evaluators. Plan with your peers to attend the next conference in Chicago!
  • Stay at the conference hotel (and make your reservation as soon as possible). You will not regret the networking opportunities it provides!
  • Attend pre-conference and post-conference workshops! Evaluation 101 is a great workshop to understand the basics of evaluation.
  • Join Topical Interest Group (TIG) business meetings. Meet like-minded evaluators!
  • Look for volunteer opportunities, especially if this is your first time. This helps you meet with other evaluators with ease (and also helps with the registration cost).
  • Participate in panel discussions. This is an excellent way to meet and learn from other evaluators.
  • Do NOT miss the opportunities to learn from the best through panel discussions, workshops, and conference sessions! (e.g., Donna Mertens, Robert Stake, Stafford Hood, Rodney Hopson, Hazel Symonette, Jody Fitzpatrick, Michael Scriven, Michael Patton, Art Hernandez, Karen Kirkhart, and Cindy Crusto have facilitated excellent sessions and provided exceptional insights for novice evaluators).
  • Bring plenty of business cards and exchange them! Remember to take notes on the cards you receive (I thought I could remember them all!). To stay connected, send a brief email within 10 days after the conference.
  • Take notes during the sessions to review later, and reflect on what you learn. Remember, reflection is what makes learning meaningful.

Rad Resources: Check out these resources before attending the conference!


Hi, my name is Erica Roberts, an AEA GEDI scholar, doctoral candidate at the University of Maryland School of Public Health, and an intern at the National Cancer Institute Office of Science Planning and Assessment. As a graduate student who is approaching the transition from student to professional in the field of public health evaluation, I would like to share with you the lessons I learned from attending the AEA conference in the hope that these lessons can be used by other graduate students planning to attend next year’s conference.

Lesson Learned: Prepare to build your professional network. The AEA conference provides an expansive and rare opportunity to meet evaluation experts, future mentors, and possible employers. Prior to attending the conference, use the Topical Interest Groups (TIGs) to navigate the conference program and identify experts in your field of interest. Remember to pack business cards and update your resume or vitae. Once at the conference – be bold! Introduce yourself to presenters from organizations or fields of practice that interest you, and have a few talking points or questions prepared. Once you’ve connected, add their information to an Excel spreadsheet and, after the conference, note if and when you follow up via email and the outcome of your discussion. This will help with professional networking down the road!

Lesson Learned: Prepare to be overwhelmed (but in a good way). Before arriving at the conference, figure out a way to stay organized that works best for you. I brought my iPad to each session and used the EverNote app to take notes. Most importantly (to my organization), I kept a “to-do” note where I listed everything I wanted to do when I returned home (e.g., articles to read, experts to connect with, student scholarships or job opportunities to apply for). It is likely that you will encounter a lot of information that you want to know more about but do not have the mental space to process – this is where making a “to-do” list for home comes in handy!

Lesson Learned: Prepare to be inspired. You may find at the AEA conference that the ways to approach evaluation are endless – depending on the field, the context, the purpose, etc. Do not let this discourage you; rather – let it inspire you. Take these ideas and put them in your back pocket, knowing that at some point you may be asked to conduct an evaluation and will have a myriad of methods and approaches to draw on. I encourage you to use the AEA conference to learn about approaches to evaluation that you are not familiar with, and identify ways in which those methods could be adapted to your work!
