Why sign language is a form of non-verbal communication

 

Sign language uses manual (hands and arms) and non-manual (face, head and torso) means of expression. The same sign can have different meanings in different cultures. The finger alphabet serves as a supplement for terms that cannot be represented with a sign of their own.

The means of expression in sign language

Sign language uses different means of expression than spoken language. While spoken language is perceived primarily acoustically, sign language relies on manual and non-manual means of expression. The manual means of expression are the hands and arms; the non-manual means of expression include facial expression, gaze, the head, the upper body and the mouth (cf. Boyes Braem, 1990, p. 17).

When imagining sign language, most people probably think first of the hands, yet it is primarily the non-manual means of expression that convey feelings and grammar (cf. Boyes Braem, 1990, p. 18).

Intercultural differences between signs

The following short story illustrates that signs can have different meanings in different regions of the world.

A signer from China meets a group of Americans who also communicate in sign language. He refers to himself by touching the tip of his nose with his index finger. The others are surprised and confused, and one of them asks: "Why the nose?", because for him that sign means weird, ugly or boring. He would never refer to himself that way, and he shows how he would express it by touching his sternum with his right index finger. Now the Chinese signer is confused, because for him this sign means hatred, disgust or nausea (cf. Boyes Braem, 1990, p. 124).

The finger alphabet

The international finger alphabet is mainly used for terms for which there are no signs, for example to spell personal names, technical terms, cities and organizations. It consists of different hand shapes that represent the letters of the alphabet, so words can be spelled in the air when paper and pen are not available. The difference between the finger alphabet and sign language is that in sign language a single movement corresponds to a whole word, while with the finger alphabet the word is spelled out letter by letter. Words should be spelled as clearly as possible, always in the same place, for example in front of the chest, and always with the same hand position. There should be as little movement as possible, which means the hand is only moved when changing from one letter to the next (cf. Boyes Braem, 1990, p. 146).

Not only single words but also entire sentences can be produced with the finger alphabet; the word order then follows the grammatical rules of the spoken language. This is not very common, however, since people who communicate in sign language use its own grammar and sentence order. It has also been shown that only 56% of finger-spelled words in a sentence are intelligible, and communicating with this alphabet over a long period is exhausting and tiring. To avoid the intercultural misunderstandings mentioned above, the finger alphabet is widely used between deaf people from different countries, although finger alphabets themselves can also differ from one another: in the British alphabet, for example, both hands are used (cf. Boyes Braem, 1990, p. 146f).

Cochlear implants and sign language


It is often advised not to respond to the gestures of children with cochlear implants, but to insist that the child say out loud what they want to say. However, if a gesture receives no response, a communication offer is being refused; the child understands this as a rejection and will restrict their communication and thus also their learning processes. After many years of research, Gisela Szagun (2012) therefore advises that the child's gestural statement be put into words and answered in words with gestural support. If a child is hearing-impaired and cannot hear sufficiently despite the cochlear implant, he or she should learn sign language as an alternative or in addition, because the development of a functional symbol system does not depend on whether it is realized with the voice or with signs.

Until now it has often been feared that communicating with signs impairs the acquisition of spoken language, but this hypothesis is not supported by empirical evidence; rather, it can be disadvantageous to withhold visually transmitted information from children with hearing impairments. A spoken conversation is always accompanied to a greater or lesser extent by gestures, and people look at each other while speaking and thus perceive mouth movements, even if not consciously. All of this helps to convey the spoken message, and it also helps people with normal hearing to acquire language. Visual information - whether in the form of the speaker's face or in the form of gestures and signs - should therefore not be withheld from children with cochlear implants, because that would make language acquisition more difficult for them compared to children with normal hearing.

By joining in the movements, even if only covertly, children learn the sounds, because the muscle movements give the brain information about how sounds are formed. This kinesthetic information is stored by the brain together with the auditory information, and the formation of the sound is gradually learned. In child development it is helpful to have information from both sensory impressions and movement available when learning, so it is extremely helpful if children also orient themselves to the mouth image when acquiring language ...


Broca's area as the central hub for sign language as well

Cognitive research on sign languages since the 1960s has shown that they are fully fledged, autonomous languages with a complex organization on several linguistic levels such as grammar and meaning. Studies on the processing of sign language in the human brain had already found some similarities as well as differences between signed and spoken languages, but it was difficult to derive a uniform picture of how both forms of language are processed in the brain.

More than seventy million deaf people worldwide use one of over two hundred sign languages, and although these draw on brain structures similar to those of spoken languages, it had not been possible to identify the brain regions that both forms of language use. Until now it was therefore not known which brain regions are actually involved in the processing of sign language, or how large the overlap is with the brain regions that hearing people use for spoken language processing.

Trettenbrein et al. (2020) recently found in a meta-analysis that Broca's area in the left hemisphere is evidently the central node for language in both signed and spoken form, which shows that the brain is geared in general towards processing the grammar and meaning of language, regardless of whether it is heard or seen. For the first time it was possible to identify, in a statistically robust manner, the areas involved in the processing of sign language across all studies. Broca's area in the frontal lobe of the left hemisphere was involved in the processing of sign language in almost every one of the studies evaluated; this area also plays a central role in spoken language, where it handles grammar and meaning. The role of the right frontal lobe, the counterpart to Broca's area on the left, could also be shown, as it too appeared again and again in many of the evaluated studies on sign language: it processes non-linguistic aspects such as spatial and social information. This means that the movements of the hands, face and body of which signs consist are in principle perceived in a similar way by deaf and hearing people, but only in deaf people do they additionally activate the language network in the left hemisphere, including Broca's area.
Deaf people thus perceive gestures as signs with linguistic content rather than as pure movement sequences, as hearing people do. From this one can conclude that the brain is specialized in language as such, but not in speaking.

Literature

Boyes Braem, P. (1990). Introduction to sign language and its research. Hamburg: Signum.

Donath, P., Hase, U., Prillwitz, S. & Wempe, K. (1996). A minority makes itself heard. Hamburg: Signum.

Szagun, G. (2012). Ways to language: A guide to language acquisition in children with cochlear implants. Pabst.

Trettenbrein, P. C., Papitto, G., Friederici, A. D. & Zaccarella, E. (2020). The functional neuroanatomy of language without speech: An ALE meta-analysis of sign language. Human Brain Mapping, 42, 699-712.

 
