Links for fulltext
(May Require Subscription)
- Publisher Website (DOI): https://doi.org/10.1145/3340555.3353726
- Scopus: eid_2-s2.0-85074951409
- WOS: WOS:000518657800016
- Appears in Collections:
Conference Paper: Dynamic adaptive gesturing predicts domain expertise in mathematics
| Title | Dynamic adaptive gesturing predicts domain expertise in mathematics |
|---|---|
| Authors | Sriramulu, Abishek; Lin, Jionghao; Oviatt, Sharon |
| Keywords | Domain expertise; Gestures; Iconic gestures; Mathematics; Multimodal learning analytics; Prediction of cognitive state; Quality of movements |
| Issue Date | 2019 |
| Citation | ICMI 2019 - Proceedings of the 2019 International Conference on Multimodal Interaction, 2019, p. 105-113 |
| Abstract | Embodied Cognition theorists believe that mathematics thinking is embodied in physical activity, like gesturing while explaining math solutions. This research asks the question whether expertise in mathematics can be detected by analyzing students' rate and type of manual gestures. The results reveal several unique findings, including that math experts reduced their total rate of gesturing by 50%, compared with non-experts. They also dynamically increased their rate of gesturing on harder problems. Although experts reduced their rate of gesturing overall, they selectively produced 62% more iconic gestures. Iconic gestures are strategic because they assist with retaining spatial information in working memory, so that inferences can be extracted to support correct problem solving. The present results on representation-level gesture patterns are convergent with recent findings on signal-level handwriting, while also contributing a causal understanding of how and why experts adapt their manual activity during problem solving. |
| Persistent Identifier | http://hdl.handle.net/10722/354139 |
| ISI Accession Number ID | WOS:000518657800016 |
| DC Field | Value | Language |
|---|---|---|
| dc.contributor.author | Sriramulu, Abishek | - |
| dc.contributor.author | Lin, Jionghao | - |
| dc.contributor.author | Oviatt, Sharon | - |
| dc.date.accessioned | 2025-02-07T08:46:42Z | - |
| dc.date.available | 2025-02-07T08:46:42Z | - |
| dc.date.issued | 2019 | - |
| dc.identifier.citation | ICMI 2019 - Proceedings of the 2019 International Conference on Multimodal Interaction, 2019, p. 105-113 | - |
| dc.identifier.uri | http://hdl.handle.net/10722/354139 | - |
| dc.description.abstract | Embodied Cognition theorists believe that mathematics thinking is embodied in physical activity, like gesturing while explaining math solutions. This research asks the question whether expertise in mathematics can be detected by analyzing students' rate and type of manual gestures. The results reveal several unique findings, including that math experts reduced their total rate of gesturing by 50%, compared with non-experts. They also dynamically increased their rate of gesturing on harder problems. Although experts reduced their rate of gesturing overall, they selectively produced 62% more iconic gestures. Iconic gestures are strategic because they assist with retaining spatial information in working memory, so that inferences can be extracted to support correct problem solving. The present results on representation-level gesture patterns are convergent with recent findings on signal-level handwriting, while also contributing a causal understanding of how and why experts adapt their manual activity during problem solving. | - |
| dc.language | eng | - |
| dc.relation.ispartof | ICMI 2019 - Proceedings of the 2019 International Conference on Multimodal Interaction | - |
| dc.subject | Domain expertise | - |
| dc.subject | Gestures | - |
| dc.subject | Iconic gestures | - |
| dc.subject | Mathematics | - |
| dc.subject | Multimodal learning analytics | - |
| dc.subject | Prediction of cognitive state | - |
| dc.subject | Quality of movements | - |
| dc.title | Dynamic adaptive gesturing predicts domain expertise in mathematics | - |
| dc.type | Conference_Paper | - |
| dc.description.nature | link_to_subscribed_fulltext | - |
| dc.identifier.doi | 10.1145/3340555.3353726 | - |
| dc.identifier.scopus | eid_2-s2.0-85074951409 | - |
| dc.identifier.spage | 105 | - |
| dc.identifier.epage | 113 | - |
| dc.identifier.isi | WOS:000518657800016 | - |
