Hi! My name is Jan Noga and I am the owner of Pathfinder Evaluation and Consulting in Cincinnati, Ohio. I’m a developmental and counseling psychologist with a specialization in early and middle childhood. I’ve spent a lot of time in preschools – as a teacher, researcher, observer, and evaluator. I learned from some amazing advisors and mentors – Eleanor Maccoby, Albert Bandura, Walter Mischel, and Urie Bronfenbrenner – lessons that have been invaluable in my work with young children.
Lesson Learned: As an evaluator, I’ve run surveys with children as young as three. I’ve gone in as a solo evaluator on a project, and I’ve also managed teams of evaluators spread out across multiple schools. The one element that I always, always insist on is prior experience working with young children in group situations. This often rules out undergraduates (and even many graduate students), simply because they haven’t worked with more than one or two young children at a time. The same goes for parents – having a toddler or preschooler does not necessarily qualify you to do data collection with a classroom of young children. The people I turn to the most often are those with preschool teaching experience – they’ve got the most experience managing a diverse group of children.
Hot Tip: Surveying young children is best done as either a one-on-one or small group activity. I’ve found that a group of four or five children is the maximum size; any more than that and the kids will run the zoo instead of you!
Hot Tip: Keep your questions simple in structure and few in number, if only for your own sanity. Preschoolers and kindergarteners can handle up to 12 questions, but anything beyond that and they will quickly get tired of the game. Kids in first through second grade can handle up to 20. Remember, you’re going to be doing this over and over again with multiple groups – how many questions do you really want to read each time?
Lesson Learned: Another issue you might encounter with very young children is developmental diversity around language and the ability to express thoughts and ideas. I find that three-year-olds tend to be a little young to understand abstract concepts but can respond to very concrete questions (“Was this fun?” “Was this hard?” “Do you like to do art by yourself or with friends?” etc.). Four- and five-year-olds will still vary in terms of developmental levels, but can respond to very basic abstract concepts. Keep questions as concrete as possible. If language and abilities are not there, you may be better off using observation in structured situations, such as a play scenario.
Do you have questions, concerns, kudos, or content to extend this aea365 contribution? Please add them in the comments section for this post on the aea365 webpage so that we may enrich our community of practice. Would you like to submit an aea365 Tip? Please send a note of interest to aea365@eval.org . aea365 is sponsored by the American Evaluation Association and provides a Tip-a-Day by and for evaluators.
Dear Ms. Noga,
As a grade one teacher, I could not agree more with your sentiments! An evaluator needs to be able to handle a group of children and know how to engage with them.
I often do in-class assessments in small groups of no more than 5 students. I prefer to complete them one-on-one, but that is not always possible. When I do have up to 5 students, I prefer to group them in specific ways so that students who require more support are with students who are more independent. In this way, if students need behavioural redirecting, I do not need to redirect the entire group, only a few students. In some cases and with some groups, I do not have more than 4 students because they may need more redirecting and attention. Sometimes it is easier to have more groups of fewer students rather than fewer groups of more students.
We also have to consider that since they are so young, they often want to connect and share their own experiences and stories, which takes up time. This is something that evaluators need to be mindful of when conducting surveys with time constraints. From experience, completing tasks with small children will always take longer than expected!
Additionally, I appreciated your point about developmental diversity around language. If children do not properly understand the questions being asked, then the responses they give will not be valid. This reminds me of a school-wide writing assessment we did last year. The prompt was “What makes a good friend?”. My young students had a very difficult time because they took the word “make” literally and understood the prompt as “How do I make a good friend?”, which is not what was intended. As a result, their responses did not match the marking criteria, and this skewed the results of the school-wide writing assessment. I think if we had asked “What does being a good friend mean?”, it would have been clearer and closer to the language they use on a day-to-day basis.
Thank you for your thoughts!
Sara
Hi Ms. Noga,
I am a primary teacher and currently a graduate student at Queen’s University. I came across your article while researching for my course project, an evaluation of a nutritious breakfast program. I was struggling with how to survey young children to see if they enjoy the program.
Young participants of the program would need help reading and filling out the survey questions. I agree that this would be best done as a one-on-one or small group activity. Large groups can quickly become overwhelming, as you point out. I was surprised to see that young children can handle up to 12 questions. In my practice I cap it at around 5 questions before they are off to the next thing or have lost focus.
I had originally planned on having community volunteers from the program help the children with the surveys. However, after reading your thoughts on only allowing someone with prior experience working with young children in group situations, I have changed my mind. Instead, I will enlist the help of the teachers who oversee the program in the school. Their experience working with children will be much more beneficial to the task. I have worked with many volunteers who lacked experience with groups of children, and it really makes a difference to how a task is completed when there is suitable supervision.
I’m left wondering now if I should change my survey questions to something simpler, as you suggest. Perhaps using pictures of a smiley face or sad face and having students circle how they felt about the meal provided. Or having the teacher show the group pictures of past meals and ask the small group to give a thumbs up or down on how they felt about each one. This would tell me whether they are happy with the meals provided, but I would still need their input on the program and what they like or dislike about it. For those questions, I will follow your advice to keep them simply structured and as concrete as possible.
My one outstanding question is regarding food waste from participants. I was hoping to have volunteers within the program report on food being thrown away or left unfinished by our participants. Do you feel this requires someone with experience with children to report on, such as the teachers supervising the children while they eat? Or would input from the volunteers who are observing from the background be sufficient?
Thank you for the great article and I look forward to any feedback you can offer,
Sarah
Hi, Sarah,
So glad you found this interesting and helpful. A key reason I was able to do 12 questions was that the entire survey activity was done as a small group game and the children were physically involved in responding. I kept groups to 4-5 children at once. The use of the bingo cards and stickers was also important – young children can place stickers with greater ease than they can draw circles around images. It also keeps the instrument (the bingo cards in my case) less crowded. If you are going to use questions that require them to circle a response (like thumbs up or down), you will need to figure out a way to set each question’s response set apart from the others – perhaps a set of boxes with an identifying number or letter. Whatever you do, don’t include the questions themselves on the response instrument – it will only confuse the kids and they will get distracted trying to read the words. I do not use written surveys with kids until at least age 12-13.
I have found that smiley and sad faces have poor validity and reliability. Happy faces are pretty easy for kids; the opposite is more difficult. A sad face may be interpreted as “this food makes me sad.” I’d use a yucky face instead – it’s more universally interpreted as “I don’t like this” and is probably better suited to questions about food likes and dislikes. Thumbs up or down would be best.
In terms of food waste – that’s tricky. Where and when is food waste being measured? Is it an overall measurement or are you interested in what each individual child is throwing out? Is an adult standing and watching kids eat? Or standing at the waste bin when they clear their food? That could be very intimidating to children and could affect the authenticity of their consumption. If they are worried about being judged, it’s possible that they will be more likely to eat or trade the food they don’t like rather than have a stranger watch them throw something away (depends on the stance their parents take at home). On the flip side, it could create enough anxiety that the child doesn’t eat at all. Even teachers could be seen as being potentially judgmental, depending on the relationship each child has with each teacher. The best way to approach tracking food waste is to have small bins for each type of food. Put a picture on the front so kids can match up food with picture. Then, when they clear, have them put any uneaten food in the matching bin. This is assuming kids do this themselves. If adults do all the clearing and disposal, it’s moot. They can track as they clear.
If you have any further questions, feel free to contact me. My website is http://www.pathfinderevaluation.com and there is a contact form on that page that comes directly to my email.
Hi Jan,
Thank you so much for your detailed response.
I will take your advice, and stick with pictures and game activities for my young participants. They will enjoy this more than a multitude of questions and I can get the results I need.
Your suggestion of using tracking bins for food waste is a great idea. I wonder if the kids would stick to it, though? Worth a try. Otherwise, as you suggested, I could ask the volunteers to track the food waste.
Thanks again,
Sarah
Hi Jan Noga,
My name is Crystal and I am an undergrad psychology student at Texas A&M Central Texas. I also work full-time as a preschool teacher at a child development center. In my day-to-day casual observations of my students, it had never occurred to me to run surveys. My interest is in childhood education.
As far as the hot tips go, I agree that to get anything remotely done, or to get decent information, small groups or one-on-one are best. And I am sure you have been there: running large group activities can be a zoo, and the children will take over quickly. I incorporate singing and dance, and through observation I can tell who doesn’t want to do it and who feels pressured by others to participate. It’s amazing how all this is happening at such a young age. So complex, yet simple.
I also agree that when questioning children, keeping it simple is best. We have lots to play with, so it can be distracting. If I refer to a book we read the previous day, I’ll keep my questions to a maximum of 4 to 5. Sometimes the three-year-olds will surprise you with wonderfully detailed conversations.
Thank you for sharing.
Hi Jan,
Thank you for a great read. Your article has greatly helped solidify and improve my current program evaluation design where I am evaluating the effectiveness of a preschool reading circle program.
Originally, I had planned on using undergraduate students, many of whom are studying to become teachers, to conduct some of the surveys. However, I now understand that they would not have prior experience working specifically with preschool-age students, which may limit their sense of what to expect from children this young. I will therefore take your advice and have the preschool teachers be the ones conducting the surveys.
It is nice to see that my plan to conduct surveys either one-on-one or as small group activities is indeed best practice. I had also planned on keeping my questions simple and straightforward. I was quite surprised that preschoolers and kindergarteners could handle up to 12 questions. It seemed like a lot to me! I could therefore re-think my survey and possibly add a few more questions.
I also like the idea of using observation in structured situations, such as a play scenario. Would you also advise preschool teachers being the ones to observe these interactions? Or could an outsider also observe and generate reasonable results?
From my own experience working with a wide age range of students, I appreciate being able to find an article that specifically deals with preschool children, as most of the information deals with school-age children. There is quite the difference between a preschooler and a first grader after all! Thanks again.
Hi, Hanna,
I apologize for the very late response. 2018 had me sidelined for most of the year with shoulder and knee surgeries, and then, sadly, I completely spaced on following comments on this post.
There is indeed a difference between a preschooler and a grade school child, even first grade. In terms of data collectors for surveys as well as observation, I certainly recommend using adults who are used to working with young children. Professionals already in the field are great. You can also look to early childhood education training programs for undergraduates who have already completed their field placements. The key is having someone who understands preschoolers and the way they think and interact. I have had cases where I had to use adults familiar with kids but not necessarily very young ones. When that is the case, I’ve found it very helpful to keep my observation instrument as open as possible with little to no coding actually done in the field. Instead, I use a modified form of scripting onto an observation instrument that has columns for categories of behavior. A time note is made in the left-hand column, then running notes in the appropriate column for what is observed. Out of the field, you can then have all observation instruments coded by one or two people who have the specific expertise to interpret.
Hope this helps. Feel free to come back with questions you may have. My website is http://www.pathfinderevaluation.com. There is a contact page on the site that will come straight to my email.
Hello Ms. Noga,
Thank you for your thoughtful article regarding conducting evaluations with young children. I found both of your articles to have helpful tips for obtaining credible and usable data when working with children.
I am a Kindergarten teacher and currently a graduate student at Queen’s University. For one of my course projects, I am evaluating a Forest School Program offered within my elementary school. I am trying to determine how children ages 3-6 feel about the program. After reading your articles, I have realized that my questions may have been too open-ended, and moving forward I will choose more specific questions. I will also ensure that the language I use is developmentally appropriate and limit the questions to a maximum of 12, as you suggest. I like the idea of using surveys with stickers, as this would provide more streamlined responses.
I agree with your conclusions about ensuring that evaluators have sufficient experience working with young children. I, too, have found that when I have volunteers in my classroom who do not have experience with young children, there are often misunderstandings about language, intent, or behaviour. It would then be difficult to ensure responses are accurate.
Regarding my current evaluation project, I was wondering if you see value in using photographs to document children’s thinking, perspectives, and experiences. I often take pictures of children engaged in activities and then have them discuss the pictures at a later date. What are your thoughts on this? Would it yield usable data?
Thank you again. I look forward to your response.
Tracey
Hi, Tracey,
I apologize for the very delayed response. I tore up a shoulder and knee and spent most of 2018 either recovering or in physical therapy. I’ve just started to hit my stride and get back into things.
In terms of using photographs, a lot depends on the level of consent required and your intended use of the pictures. The use you describe is certainly valid and shouldn’t be a concern. I love the idea of doing that as it’s a great way to help them remember back more accurately and to tie into not just what they were doing but how they felt about it. There’s tremendous power in visual imagery that gets missed in research and evaluation.
Good luck. If you have further questions, my website is http://www.pathfinderevaluation.com and has a contact page that will go directly to my email.
Hello Jan. Excellent hot tips. Thank you for sharing. What kind of questions do you ask them? Do you find you get better results one-on-one than in small groups? Or does it depend on the class setting and children? Do you ask them the questions more than once in a year? I’m curious whether the answers would change based on their comfort level in September versus June, also keeping in mind that their maturity has increased. What kind of information are you looking for as a developmental and counseling psychologist? Do the parents need to consent before you do your surveys? Thanks in advance.
Hi, Teri,
I’m so, so sorry about the delayed response. I’ve been battling shoulder and knee issues that resulted in several surgeries. 2018 in particular was basically a wash for me. Sadly, by the time I was up and around, I had completely forgotten about checking for comments.
Happy to address your questions:
Kinds of questions: Very concrete questions that are tightly focused. My favorite is to make a statement and have the kids use a colored sticker to indicate how well the statement describes them. For example: “I like to come to school.” I would then say, “If you like to come to school all or most of the time, put a blue sticker on [whatever you are using for responses]. If you don’t like to come to school at all, put an orange sticker. If sometimes you like to come and sometimes you don’t, put a green sticker.” I would be happy to provide a sample of the questions I use – if you use the contact page on my website (www.pathfinderevaluation.com), your questions will come directly to my email and I can get something back to you.
Grouping: In general, small groups of 4-5 work fine. However, there will be the occasional child who patterns (lays out a row of a single color or makes patterns with the colors instead of actually answering), copies a neighbor’s responses, or just doesn’t understand. In that case, I will likely take the child on their own after I’ve finished the whole group. I never imply that they did anything wrong. For instance, when a child copies a neighbor, I might say, “It looks like you were having trouble making up your mind about some questions. Would you like to play the game again?” With first through fifth graders, I try to keep groups to a max of 6. For kids beyond that age, I find that they can easily do this as a whole-class activity with the questions projected and them bubbling or circling their responses.
Frequency: I wouldn’t do this more often than twice in a school year, but doing it twice is a great way to get some idea of how they change over the course of the year. If you are doing it twice – late September/early October and late March/early April are ideal. Enough time in the fall for them to be comfortable with their current classroom; a time in the spring that doesn’t have too many holidays with the exception of spring break (make sure you don’t do this within a week either side of break). This is based on midwestern US school calendars with a mid to late August start and a mid-May end. You can adjust accordingly.
What am I looking for?: When I do these, I’m particularly interested in children’s degree of comfort with and affection for school, their sense of belonging, their attitudes about their teacher, and their sense of efficacy in terms of being able to do the work required of them.
Parental consent: This really, really depends on your role, your goals for doing this, and the intended use of the information. If you are a teacher, this falls quite legitimately in the zone of student assessment. As long as you are using the information to inform your understanding of your students and your teaching practice, you’re good. If you are still a classroom teacher but plan to write this up for a practice journal, then as long as you keep the focus on assessment and what you are learning, and anonymize your findings to the greatest extent you can, you will be okay. An evaluator or researcher who is external to the school and/or classroom? You absolutely need parental consent. In the past, we were allowed to do this via negative consent – consent was implied unless a parent objected (more on this in a bit). Now, we are required to get positive consent – every parent/guardian has to consent. If you are an evaluator doing this under contract to a school district, they will usually help with this. A note on refusal: I usually contact any parent who refuses in order to better understand their questions and concerns. I’ve had it happen, but in every case, once I’ve explained the nature of the activity, they have been okay with their child “playing the game” as long as I don’t use the data. I then pull those cards while collecting them after each group, hold them aside, and shred them back at my office.
Again, so sorry for the delay. If you have more questions or just want to bounce things off me, my website is http://www.pathfinderevaluation.com. There is a contact page on the site that will feed directly into my email.