A '''gesture''' is a form of [[non-verbal communication]] or non-vocal [[communication]] in which visible bodily actions communicate particular messages, either in place of, or in conjunction with, [[speech]]. Gestures include movement of the [[hand]]s, [[face]], or other parts of the [[Human body|body]]. Gestures differ from physical non-verbal communication that does not communicate specific messages, such as purely [[Emotional expression|expressive]] displays, [[proxemics]], or displays of [[joint attention]].<ref name=Kendon>Kendon, Adam. (2004) ''Gesture: Visible Action as Utterance''. Cambridge: Cambridge University Press. {{ISBN|0-521-83525-9}}</ref> Gestures allow individuals to communicate a variety of feelings and thoughts, from contempt and hostility to approval and affection, often together with [[body language]] in addition to [[word]]s when they speak. Gesticulation and speech work independently of each other, but join to provide emphasis and meaning.
 
Gesture processing takes place in areas of the brain such as [[Broca's area|Broca's]] and [[Wernicke's area]]s, which are used by [[speech]] and [[sign language]].<ref name="Xu">{{cite journal | last1 = Xu | first1 = J | last2 = Gannon | first2 = PJ | last3 = Emmorey | first3 = K | last4 = Smith | first4 = JF | last5 = Braun | first5 = AR | year = 2009 | title = Symbolic gestures and spoken language are processed by a common neural system | url = https://www.ncbi.nlm.nih.gov/pmc/articles/PMC2779203/pdf/pnas.0909197106.pdf | format = PDF | journal = Proc Natl Acad Sci U S A. | volume = 106 | pages = 20664–20669 | doi = 10.1073/pnas.0909197106 | pmid = 19923436 | pmc = 2779203 }}</ref> In fact, some scholars think that language evolved in ''Homo sapiens'' from an earlier system of manual gestures.<ref name="Corballis2010">{{cite journal|last=Corballis|first=Michael|title=The gestural origins of language|journal=WIREs Cognitive Science|date=January–February 2010|volume=1|issue=1|pages=2–7|doi=10.1002/wcs.2|pmid=26272832}}</ref> The theory that language evolved from manual gestures, termed [[Origin of language#Gestural theory|Gestural Theory]], dates back to the work of the 18th-century philosopher and priest [[Étienne Bonnot de Condillac|Abbé de Condillac]], and was revived by the contemporary anthropologist Gordon W. Hewes in 1973 as part of a discussion on the [[origin of language]].<ref name="Corballis2010" />
 
== Research throughout the ages ==
The linkage of hand and body gestures in conjunction with speech is further revealed by the nature of gesture use in blind individuals during conversation. This phenomenon uncovers a function of gesture that goes beyond portraying communicative content of language and extends [[David McNeill]]'s view of the gesture-speech system. This suggests that gesture and speech work tightly together, and a disruption of one (speech or gesture) will cause a problem in the other. Studies have found strong evidence that speech and gesture are innately linked in the brain and work in an efficiently wired and choreographed system. McNeill's view of this linkage in the brain is just one of three currently up for debate; the others declaring gesture to be a "support system" of spoken language or a physical mechanism for lexical retrieval.<ref>{{cite journal|last=Iverson |first=Jana M. |author2=Esther Thelen |title=Hand, Mouth and Brain |journal=Journal of Consciousness Studies |year=2005 |url=http://cspeech.ucd.ie/~fred/docs/IversonThelen.pdf |accessdate=1 October 2013 |url-status=dead |archiveurl=https://web.archive.org/web/20131004215359/http://cspeech.ucd.ie/~fred/docs/IversonThelen.pdf |archivedate=4 October 2013 }}</ref>
 
Because of this connection of co-speech gestures—a form of manual action—to language in the brain, Roel Willems and Peter Hagoort conclude that both gestures and language contribute to the understanding and decoding of a speaker's encoded message. Willems and Hagoort's research suggests that "processing evoked by gestures is qualitatively similar to that of words at the level of semantic processing." This conclusion is supported by findings from experiments by Skipper, in which the use of gestures led to "a division of labor between areas related to language or action (Broca's area and premotor/primary motor cortex respectively)". The use of gestures in combination with speech allowed the brain to decrease the need for "semantic control": because gestures aided understanding of the relayed message, there was less need for the semantic selection or control that would otherwise be required of the listener through [[Broca's area]]. Gestures are a way to represent the thoughts of an individual, which are prompted in working memory. The results of one experiment revealed that adults were more accurate when they used pointing gestures than when they simply counted in their heads without pointing.<ref name="VASC, Dermina 2013"/> Furthermore, the results of a study by Marstaller and Burianová suggest that the use of gestures affects working memory: participants with low working-memory capacity who were able to use gestures recalled more terms than those with low capacity who were not.<ref>{{cite journal | last1 = Marstaller | first1 = Lars | last2 = Burianová | first2 = Hana | year = 2013 | title = Individual differences in the gesture effect on working memory | journal = Psychonomic Bulletin & Review | volume = 20 | pages = 496–500 | doi = 10.3758/s13423-012-0365-0 }}</ref>
 
Although there is an obvious connection in the aid of gestures in understanding a message, "the understanding of gestures is not the same as understanding spoken language." These two functions work together and gestures help facilitate understanding, but they only "partly drive the neural language system".<ref>{{cite journal | last1 = Willems | first1 = Roel M. | last2 = Hagoort | first2 = Peter | year = 2007 | title = Neural Evidence for the Interplay between Language, Gesture, and Action: A Review | url = | journal = Brain and Language | volume = 101 | issue = 3| pages = 14–6 | doi=10.1016/j.bandl.2007.03.004| pmid = 17416411 | hdl = 11858/00-001M-0000-0013-198D-E }}</ref>