Article: Lexical processing in sign language: A visual mismatch negativity study

Title: Lexical processing in sign language: A visual mismatch negativity study
Authors: Deng, Q; Gu, F; Tong, SX
Keywords: Visual mismatch negativity (vMMN); Deaf signers; Lexical processing; Hong Kong Sign Language (HKSL)
Issue Date: 2020
Publisher: Pergamon. The journal's web site is located at http://www.elsevier.com/locate/neuropsychologia
Citation: Neuropsychologia, 2020, v. 148, article no. 107629
Abstract: Event-related potential studies of spoken and written language show the automatic access of auditory and visual words, as indexed by mismatch negativity (MMN) or visual MMN (vMMN). The present study examined whether the same automatic lexical processing occurs in a visual-gestural language, i.e., Hong Kong Sign Language (HKSL). Using a classic visual oddball paradigm, deaf signers and hearing non-signers were presented with a sequence of static images representing HKSL lexical signs and non-signs. When compared with hearing non-signers, deaf signers exhibited an enhanced vMMN elicited by the lexical signs at around 230 ms, and a larger P1–N170 complex evoked by both lexical sign and non-sign standards at the parieto-occipital area in the early time window between 65 ms and 170 ms. These findings indicate that deaf signers implicitly process the lexical sign and that neural response differences between deaf signers and hearing non-signers occur at the early stage of sign processing.
Persistent Identifier: http://hdl.handle.net/10722/289502
ISSN: 0028-3932
2019 Impact Factor: 2.652
2015 SCImago Journal Rankings: 2.072

 

DC Field: Value
dc.contributor.author: Deng, Q
dc.contributor.author: Gu, F
dc.contributor.author: Tong, SX
dc.date.accessioned: 2020-10-22T08:13:34Z
dc.date.available: 2020-10-22T08:13:34Z
dc.date.issued: 2020
dc.identifier.citation: Neuropsychologia, 2020, v. 148, article no. 107629
dc.identifier.issn: 0028-3932
dc.identifier.uri: http://hdl.handle.net/10722/289502
dc.description.abstract: Event-related potential studies of spoken and written language show the automatic access of auditory and visual words, as indexed by mismatch negativity (MMN) or visual MMN (vMMN). The present study examined whether the same automatic lexical processing occurs in a visual-gestural language, i.e., Hong Kong Sign Language (HKSL). Using a classic visual oddball paradigm, deaf signers and hearing non-signers were presented with a sequence of static images representing HKSL lexical signs and non-signs. When compared with hearing non-signers, deaf signers exhibited an enhanced vMMN elicited by the lexical signs at around 230 ms, and a larger P1–N170 complex evoked by both lexical sign and non-sign standards at the parieto-occipital area in the early time window between 65 ms and 170 ms. These findings indicate that deaf signers implicitly process the lexical sign and that neural response differences between deaf signers and hearing non-signers occur at the early stage of sign processing.
dc.language: eng
dc.publisher: Pergamon. The journal's web site is located at http://www.elsevier.com/locate/neuropsychologia
dc.relation.ispartof: Neuropsychologia
dc.subject: Visual mismatch negativity (vMMN)
dc.subject: Deaf signers
dc.subject: Lexical processing
dc.subject: Hong Kong Sign Language (HKSL)
dc.title: Lexical processing in sign language: A visual mismatch negativity study
dc.type: Article
dc.identifier.email: Tong, SX: xltong@hku.hk
dc.identifier.authority: Tong, SX=rp01546
dc.description.nature: link_to_subscribed_fulltext
dc.identifier.doi: 10.1016/j.neuropsychologia.2020.107629
dc.identifier.pmid: 32976852
dc.identifier.scopus: eid_2-s2.0-85091914671
dc.identifier.hkuros: 316421
dc.identifier.volume: 148
dc.identifier.spage: article no. 107629
dc.identifier.epage: article no. 107629
dc.publisher.place: United Kingdom
dc.identifier.issnl: 0028-3932
