Where did the kids go? Exploring Interaction at a Science Museum Exhibit

A post by Naomi Thompson

Hello! My name is Naomi Thompson and I’m entering my 3rd year in the Indiana University Learning Sciences program. Dr. Kylie Peppler is my advisor. I’m interested in intersections between art/making/play and STEM, especially in informal spaces. Lately, I’ve become increasingly interested in museums, such as science centers, as really interesting places for these intersections to thrive. Science centers and museums especially can provide many opportunities to redefine what comprises STEM, and provide young children with powerful hands-on experiences. Museum exhibits for children are generally designed to provide whole-body, interactive learning experiences. Observing these interactions can provide meaningful insight into how children engage and learn across various kinds of settings.

A Child Playing with Water Works at Wonderlab, photo by Chris Higgins

For a graduate seminar titled Knowing, Designing, and Learning with Dr. Sean Duncan, I got the chance to spend a few hours observing at the WonderLab Museum of Science, Health, and Technology in Bloomington, IN. It’s a popular space where children and their families often spend considerable amounts of time playing with and learning about a wide array of scientific concepts. I found myself hovering around the Water Works water table, fascinated by the children’s play with the water and toys there. This exhibit is part of a cluster of activities throughout the museum meant to uncover “how things work.” It featured a large, cylindrical container, partially raised above the surface of the shallow pool in the table, that I came to call the vortex. The water inside spun constantly in a downward spiral, sucking in any small object placed inside and sending it back out into the general pool. Most of the visitors playing were small children, mostly younger than five years old. As young children often do when they learn about something new and exciting, they tended to repeat their actions, picking up brightly colored plastic balls and dropping them into the vortex over and over, fascinated by the results each time. Alternatively, the children could toddle to the far end of the table, which was full of pipes, fountains, and faucets, and send the balls down a ramp to be shot into the air by a strong fountain. The balls would then roll down a guide wire and plop back into the vortex. The children seemed to be having so much fun that I wished I had gotten up to play with them. I was curious to see what might happen as this activity evolved over time.

During my second day of observation, just about all of the brightly colored plastic balls had gotten stuck at the top of the fountain, unable to roll down the wire into the vortex. One boy with brown hair, a little older than some of the others, sent the last ball up, apparently to see if it could jog the others loose. He watched to see what would happen, and when this ball got stuck as well, he played with other parts of Water Works briefly before walking away. By this time, very few children were left at the table – only three slightly older children compared to the usual six to eight young ones – and parents had noticed that all the balls were stuck. Some nearby parents tried using sticks and toys to knock the balls down, and briefly freed a few that promptly got stuck again. Finally, a parent went to get a staff member, who used a broom to knock them all down. Moments later, almost without my noticing, there were six young children playing at the table again, with the colored balls circulating nicely along their paths.

It was really interesting to witness how children’s interactions changed when a breakdown in the exhibit occurred. Whether it was a temporary flaw, or the exhibit was purposely designed to occasionally “break,” there are interesting implications here for what engages young children, and what learning opportunities exist when things break. When do children try to fix the problem, and when do they decide to move on to something else? It would be really interesting to study these moments of accidental and purposeful “breaking,” and look into how different children in different settings respond to these setbacks. Is there something about informal settings that lets kids feel like they can try a new activity if something isn’t working for them? Or, since the stakes are lower, do they feel more able to work through problems if they want to? What roles do adults play in these situations? I’d be interested to hear ideas and suggestions from others about this line of thought!

Brief Reflection about Using a Text Mining Approach in Design-Based Research

A post by Alejandro Andrade

After having the opportunity to explore a text mining approach for analyzing data in a design-based research project about using video to support pre-service teachers’ ability to notice (Van Es & Sherin, 2002), I have three major ideas to share. First, these data-mining techniques are flexible and powerful tools, yet one should be aware of several of their limitations. For instance, the stemmed words in a text document are only proxies of participants’ conceptual engagement, and rather distal than proximal evidence at that. The bag-of-words approach, the one used in my analysis, overlooks a great deal of information that might have been relevant for teasing apart more nuanced hypotheses. Nonetheless, distal as it may be, the approach did provide relevant evidence in the context of the present study, for instance about the relationships between the learning theories and the students’ analytics, and between the students’ and the experts’ analytics.

Second, while the bag-of-words is one text mining approach, it is not the only computerized technique available. Indeed, other more powerful tools can supplement or replace it. For instance, some computerized linguistic analyses can provide measures of coherence and cohesion in text documents. One such technique is Coh-Metrix (Graesser, McNamara, Louwerse, & Cai, 2004), a free online tool that provides more than a hundred different indices of text characteristics. Among others, Coh-Metrix reports text easability, referential cohesion, content word overlap, connective incidence, passivity, causal verbs and particles, intentional particles, and temporal cohesion. With a tool like this, one could supplement the findings about differences and similarities between the students’ and experts’ analytics, for instance.
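Coh-Metrix itself runs as an online service, but the flavor of one of its indices, content word overlap between adjacent sentences, can be sketched as a toy proxy. The sentence splitter, stopword list, and Jaccard formula below are my own simplifications for illustration, not Coh-Metrix’s actual computation:

```python
import re

STOPWORDS = {"the", "a", "an", "of", "to", "and", "is", "are", "in", "it"}

def content_words(sentence):
    """Content words = lowercase alphabetic tokens minus stopwords."""
    return set(re.findall(r"[a-z]+", sentence.lower())) - STOPWORDS

def adjacent_overlap(text):
    """Mean Jaccard overlap of content words between adjacent sentences --
    a rough stand-in for a referential-cohesion index."""
    sentences = [s for s in re.split(r"[.!?]+", text) if s.strip()]
    if len(sentences) < 2:
        return 0.0
    overlaps = []
    for prev, curr in zip(sentences, sentences[1:]):
        a, b = content_words(prev), content_words(curr)
        overlaps.append(len(a & b) / len(a | b) if a | b else 0.0)
    return sum(overlaps) / len(overlaps)
```

A more cohesive analysis, one whose sentences keep referring back to the same concepts, scores higher on a measure like this, which is what makes such indices useful for comparing student and expert write-ups.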

Third, I believe that the incorporation of computational techniques into the researcher’s toolkit is bound to gain traction in the learning sciences. In particular, as researchers adopt design-based research methodologies (Cobb, Confrey, diSessa, Lehrer, & Schauble, 2003; W. Sandoval, 2014; W. A. Sandoval & Bell, 2004) that demand a sequence of test-and-refine iterations, they might consider having tools in their belts that allow a swift and reliable understanding of their results. Unlike slower qualitative coding-scheme-based approaches that require inter-rater reliability, content-analytic tools such as the bag-of-words are faster and more consistent. These quantitative tools are also useful whether one is working with a small group of students or with samples of several hundred or even several thousand participants. This doesn’t mean that traditional coding schemes are not good, or that we should stop caring about them. On the contrary, I believe the two approaches can work in tandem: computational techniques provide a first, quick-and-dirty pass at the data that can inform the research team on how to adapt and refine the design, and then, when resources allow, researchers can go deep into the data and examine the nuances of student learning interactions.

References

Cobb, P., Confrey, J., diSessa, A., Lehrer, R., & Schauble, L. (2003). Design experiments in educational research. Educational Researcher, 32(1), 9-13.

Graesser, A. C., McNamara, D. S., Louwerse, M. M., & Cai, Z. (2004). Coh-Metrix: Analysis of text on cohesion and language. Behavior Research Methods, Instruments, & Computers, 36(2), 193-202.

Sandoval, W. (2014). Conjecture mapping: an approach to systematic educational design research. Journal of the Learning Sciences, 23(1), 18-36.

Sandoval, W. A., & Bell, P. (2004). Design-based research methods for studying learning in context: Introduction. Educational Psychologist, 39(4), 199-201.

Van Es, E. A., & Sherin, M. G. (2002). Learning to notice: Scaffolding new teachers’ interpretations of classroom interactions. Journal of Technology and Teacher Education, 10(4), 571-596.
