Friday, December 28, 2007

VISUAL COMMUNICATION AS A PRIMARY SYSTEM, Sandra Moriarty

Journal of Visual Literacy 14:2 (1994): 11-21


VISUAL COMMUNICATION AS A PRIMARY SYSTEM
In a review of the contributions of various philosophers to semiotics, Thomas Sebeok, founder and director of the Center for Language and Semiotic Studies at Indiana University, observed that Bacon, for one, "did not commit the vulgar error of identifying language with communication." (1991, p. 71) In other words, in Sebeok's view, language is only one type of communication, and it is narrow-minded, if not vulgar, to privilege language over other types of sign systems.

On the other hand, the equally respected Italian semiotician Umberto Eco declares that "general semiotics studies the whole of the human signifying activity--languages--and languages are what constitutes human beings as such, that is, as semiotic animals." (1986, p. 12) Either Eco defines language more broadly than Sebeok, or he is declaring, in opposition to Sebeok, that language is not only the most important sign system but that it encompasses the whole of human signification.

My aim here is not to deny the importance of verbal language, for clearly the two communication systems--visual and verbal--are interdependent. Instead, I wish to emphasize the need for theories of visual communication that parallel the emphasis historically placed on language-based communication, and to present an initial theory of visual communication as a primary form of communication different from, but as important as, language-based communication.

The Debate in Perspective

Sebeok's comments reflect the view of the semiotics field, whose leading thinker was the turn-of-the-century American philosopher C.S. Peirce. Peircean semiotics, with its roots in philosophy, is concerned with the study of signs and considers sign systems to be much broader than language. Peirce developed a logic-centered orientation grounded in empirical observation.

The continental approach, called semiology, was formulated largely by the Swiss linguist Ferdinand de Saussure during the early years of the 20th century. Semiology is rooted in the study of language and the two-part sign relationship between a signifier and its signified. Saussure, although he admits that signs can be other than words, focuses most of his attention on how meaning is created through words. His work, as well as that of his followers, largely concentrates on linguistically based theories and forms of analysis.

In contrast, Peirce's tripartite system for analyzing signs includes iconic, indexical, and symbolic categories of meaning, which provide a much richer field for visual analysis. Symbols are arbitrary, and their meaning is agreed to through convention; icons and indexes are "motivated," that is, they are more likely to resemble or be connected to their object in some way. Peirce defines an icon as similar to its subject--a representation such as a drawing or photograph where likeness or resemblance is a determining characteristic. An index is physically connected with its object as an indication that something exists or has occurred: a footprint means someone walked by, or smoke means there is a fire.

Given the broader view of semiotics, it is no wonder that visual communication is more likely to find a home with Peirce than with Saussure. While "visual semiotics" has an intuitive logic, one can only wonder if the term "visual semiology" as used by Metz (1980, p. 63) is a contradiction in terms.

Semiologists see language, particularly verbal language, as the primary communication system and "distinguish sharply between intentional conventionalized devices (which they call signs) and other natural or unintentional manifestations which do not, strictly speaking, deserve such a name." (Eco, 1979, p. 15; Worth, p. 107) In this scheme, indexical signs, for example, are given little consideration. Worth agrees with the notion that intentionality is a requirement of communication. Eco, however, describes the Peircean approach as "more comprehensive and fruitful" because, as he explains, "It does not demand the qualities of being intentionally emitted and artificially produced." Furthermore, it broadens the concept of the sign and introduces the idea of an interpretant, a concept that is extremely important in understanding the complexities of visual meaning.

Eco, nevertheless, as the opening quotes demonstrate, seems to posit a language-based semiology. One reason for what appears to be Eco's confusion is that he has tried in his own work to marry the Cartesian mentalism of continental semiology with the pragmatics of post-Peircean semiotics. In spite of these integrative efforts, he still appears to privilege language and relegate non-linguistic sign systems to the periphery of semiotics. In his A Theory of Semiotics he describes his view of language-based semiotics in this way (1979, p. 172):

"every theory of signification and communication has only one primary objective, i.e. verbal language, all other languages being imperfect approximations to its capacities and therefore constituting peripheral and impure instances of semiotic devices."

Turn-of-the-century semiotics, however, is not the earliest or most definitive work on signs and their relationship to language, as Eco reminds us in his book Semiotics and the Philosophy of Language. Many of these earlier approaches were not as wedded to language as the root of communication and thinking. Historically, the classical Greeks, particularly Aristotle, were reluctant to equate signs and words, although in the semiotics of the Stoics the theory of language finally becomes associated with the theory of signs. Augustine, however, fifteen centuries before Peirce, recognized the genus of signs of which linguistic signs were only one species. (Eco, 1986, pp. 27, 31, 33) The debate about words and their relationship to other signs has been going on for a long time.

The Primary Modeling System

Questions about the primacy of language are of interest to scholars in visual communication because embedded in this discussion is a presumption that visual communication is secondary or peripheral. Peirce, and more recently Sebeok, present us with a theory of semiotics that is not only broader than a language-based sign system but that also treats both nonverbal and visual communication with the respect due to seminal communication systems.

But first let's consider the concept of a primary system. The Soviet and Eastern European school (Lotman, Zaliznjak, Ivanov, and Toporov, among others), even though it bases its work on the concept of sign systems and is considered a school of semiotics rather than semiology, has done much to privilege language through its use of the phrase "primary modeling system." In Lotman's view, language is a primary modeling system through which "the other systems are expressed." (Eco, 1986, p. 32)

The concept of a primary modeling system, which Sebeok reviews in his book A Sign is Just a Sign, has been central to the Soviet school since the early 1960s. In the Soviet view, secondary systems are large, global controlling operations--social, cultural, and ideological systems that organize behavior. At their base, providing an infrastructure for these larger systems, is natural language, which the Soviets call the primary modeling system. In other words, language is a primary system because it is the base on which more complex social and cultural systems, such as myth and religion, are built; these are described as superstructures. (Sebeok 1991, p. 50)

The reason the Soviet theory may seem confusing is that the word "primary" can also suggest important, privileged, or first in development, as well as a base for other more complex systems, which is how the Soviets use it. If primary means first or most important, then secondary, in more common usage, would naturally mean systems of lesser importance rather than the global systems the Soviets designate. In this sense, other specialized systems dependent upon language--writing, as well as the myth and religion Sebeok called superstructures--would be referred to as secondary.

Eco provides an example of this terminological tangle. He explains the theory of primary modeling systems and, in the process, appears in his interpretation to modify the Soviets' use of "secondary" to mean derivative: "verbal language could be defined as the primary modeling system, the others being only 'secondary,' derivative (and partial) translations of some of its devices." The word derivative carries with it notions of antecedence, as well as implying lesser importance.

Saussure, however, was clearly focused on the relative importance of language and presumably would have used the word "primary" to mark language's greater importance. In his Course in General Linguistics, he observes (Davis and Schleifer, 1989):

Language is a system of signs that express ideas, and is therefore comparable to a system of writing, the alphabet of deaf-mutes, symbolic rites, polite formulas, military signals, etc. But it is the most important of all these systems.

Lotman and Uspensky elaborated on the Soviet view of language as a primary modeling system in the 1970s and explained that "language is viewed as carrying out a specific communicative function by providing the collective with a presumption of communicability." (Sebeok, 1991, p. 50) That presumption seems to mean that communicability begins with language. Eco elaborates on that notion and makes the point that language is primary because it is how we express our thoughts (1979, p. 172):

"it could be defined as the primary way in which man translates his thoughts, speaking and thinking being a privileged area of a semiotic inquiry, so that linguistics is not only the most important branch of semiotics but the model for every semiotic activity.

Which Came First: Words or Pictures?

But where does visual communication lie in this debate about the primacy of communication systems? Presumably the Soviets would see it as either a tertiary form of communication or a superstructure built on language. This conflicts with the view of Plato, among others, who has Socrates describe in the Phaedo two worlds: the first a murky world of imperfection seen through the tangled and inept medium of speech, the second an "upper world" of perfection and light where all things are communicated visually, unmediated, and without the need for words. Burgin makes the point, however, that Plato's rather naive view of imagery still affects the way we talk and think about art, and particularly about the ineffable purity and credibility of representative images--i.e., "seeing is believing." (Burgin, 1983, p. 243)

Although Christian Metz suggests in his classic work, "The Perceived and the Named," that "It would be a fruitless quarrel that would initially seek to know whether it is language that informs perception or perception which informs language," he still comes down on the side of language as a primary system. (1980, pp. 59-63) He argues that language, compared with "all nonlinguistic codes," is a metalanguage--a universal communication code that makes possible the exchange of information from other codes (principally formalized languages, mathematical notation, and chemical notation). Language is used to introduce these codes, explicate them, and define their field of validity. He is clearly on the side of the linguists:

"Every semiologist has noted that language, through its relationship with other codes, occupies a non symmetrical and privileged position in that it affects the quantitative extension of the material of the signified (the total field of 'things that once can say')."

He concludes that "Language can say, even if sometimes only with approximation, what all the other codes can say, while the inverse is not true." When he specifically discusses visuals he suggests that "language does much more than transcode vision;" in his view language is needed not only to explicate vision but to complete it--i.e. by supplying a name.

Sebeok, who clearly believes that a debate about which came first, language or perception, is not fruitless but important, speculates that language evolved in prehistory first as an adaptive function, principally to enhance imitative signaling (a visual function)--the evolutionary focus was more on language-as-modeling-system than on speech-as-communication. Furthermore, at the time language evolved, Homo habilis is thought to have had very sophisticated visual and nonverbal sign repertoires but a primitive verbal system limited to gutturals rather than articulate, linear speech. In this sense, Sebeok argues that, "properly speaking, language itself is a secondary modeling system." He concludes that "the general belief that language replaced the cruder systems is totally wrong." (1991, pp. 57, 71)

Child development scholars would agree that visual communication skills are not secondary, derivative, impure, or peripheral and, in fact, develop earlier than verbal skills in children. Burgin, in his essay "Seeing Sense," notes that Freud, in line with Piaget's analysis of children's developmental processes, describes the primary processes as preceding the secondary processes in the mental development of the individual, and that the primary processes "are pre-verbal in origin and thus prefer to handle images rather than words." He continues, "Where words are handled they are treated as far as possible like images." (1983, p. 231) This mirrors the way the perceptual process works where, as Worth explains, the natural world presents itself directly to the information processing system. (1981, p. 171)

The willingness to privilege language as the primary communication system has also led to works such as Neil Postman's criticism of television, in which he argues that most of the world's current problems can be traced to the switch from written to visual communication--i.e., from reading books to watching television. A similar critique was delivered to the International Visual Literacy Association by Daniel Maher, who pointed to a cultural shift that, in his view, has made the word subordinate to the image. He argues that the mass media, presumably television, "represents the epitome of the image dominating over the word" and that, as a result, language is being reduced from, in his terms, a symbol system to a sign system. In other words, because the image rather than the word now carries the meaning, language is degenerating.

A more balanced view of the relationship between the two sign systems is Sebeok's model of a communication system based on two "mutually sustaining repertoires of signs, the zoosemiotic nonverbal, plus superimposed, the anthroposemiotic verbal." (1991, p. 55) Sebeok speculates that "the latter is the modeling system the Soviet scholars call primary but which, in truth, is phylogenetically as well as ontogenetically secondary to the nonverbal." In other words, the primary/secondary or superstructure/substructure relationship does not hold between visual and verbal communication systems as it does between spoken and written language, where written language is more clearly dependent upon the primacy of spoken language.

Array of Sign Systems

An important notion introduced in the preceding comments is the idea that human language is only one of many sign systems and within this broader world language is certainly not "primary." In many civilizations, particularly prehistorical and non-Western societies, forms of communication relying on other sensory systems are far better developed and language plays a less dominant role. In other words, there is a large world of communication sign systems and language-based sign systems are only one part of that universe.

Several writers, for example, have reviewed the breadth of communicative sign systems and identified a number of other semiotic fields of study where meaning is communicated by means other than language. Eco builds his array "starting from the apparently more 'natural' and 'spontaneous' communicative processes (zoosemiotics, or animal communication systems) and going on to more complex 'cultural' systems" such as language. The following terms, for example, are used to identify areas of communication study in the natural world (Eco 1979; Sebeok 1991): protosemiotics (the solar system), zoosemiosis (animal communication), and biosemiosis, a general term for all types of biological study that includes zoosemiosis as well as endosemiotics (communication at the molecular and cellular level), phytosemiotics (plant systems, i.e. photosynthesis), and mycosemiotics (fungi signaling systems).

Anthroposemiosis, a form of biosemiosis, is where human communication appears in this broad array of naturally based communication systems. As Sebeok noted, human communication includes language-based systems, which he calls anthroposemiotic (both spoken and written language), as well as nonverbal or zoosemiotic systems. Strictly speaking, zoosemiotic is defined as non-human or animal communication, so nonverbal is probably a better term, although it has its limitations since it is based on an oppositional logic.

Where, then, does visual communication fit? Human communication is both anthroposemiotic (language-based) and nonverbal. In the nonverbal area, for example, written language is communicated visually, as are many other nonverbal language or notation systems (Braille, mathematical, musical, and choreographic codes), symbolic systems (dress, cosmetics, heraldry, road signs, maps, engineering and architectural schematics, algebra, chemistry tables), and many forms of what we call visual communication (film codes, color systems, layout, composition, aesthetics, etc.). In areas closer to what Sebeok was referring to when he used the word zoosemiotics, we find other sensory communication systems, such as kinesics and proxemics, and sensory codes such as the language of perfume, as well as indexical and iconographic sign recognition. This variety of communication systems that are not language based illustrates Sebeok's point about the "vulgar error."

Relatively speaking, human language-based communication actually falls rather far down in this array of sign systems. Some might argue that by privileging human language we are actually guilty of speciesism.

The Factor of Abstraction

Most scholars who see language as the primary sign system (for humans) do so because of the complexity and richness of language and its ability to express abstract as well as reality-dependent information. As Eco analyzes this line of reasoning, he seems to come down on both sides of the fence. First, he points out that only language has the quality of "effability," which he explains as follows (1979, p. 172):

The effability power of verbal language is undoubtedly due to its great articulatory and combinational flexibility, which is obtained by putting together highly standardized discrete units, easily learned and susceptible to a reasonable range of non-pertinent variations.

By that he is suggesting that every human experience, as well as every thought, can be expressed through verbal language, while the opposite is not true. Yet he also points to experiences that cannot, in fact, be expressed well in language, such as the "meaning" of certain Neapolitan gestures, and he admits that there is a vast realm of "unspeakable" but not "unexpressible" content. He finally concludes that non-linguistic communication is also important, if not as important as language, and calls for a broader semiotic inquiry into other legitimate sign systems, recognizing that verbal language is not "the privileged vehicle for thought alone." (1979, p. 174)

The question of where thought occurs is fuzzy because the perceptual and cognitive systems are interwoven. The perceptual process (all senses, not just visual or auditory) is engaged before information is processed. Processing involves the manipulation of concepts and reasoning. Thoughts are often defined as "mental representations," which suggests the importance of visual perception and memory in thinking. Gibson explains how thought operates at both the verbal and visual levels (1971, p. 34):

Not only do we perceive in terms of visual information, we also can think in those terms. Making and looking at pictures helps us to fix these terms. We also can think in terms of verbal information, as is obvious, and words enable us to fix, classify and consolidate our ideas. But the difference is that visual thinking is freer and less stereotyped than verbal thinking: there is no vocabulary of picturing as there is of saying. As every artist knows, there are thoughts that can be visualized without being verbalized.

There are actually two views on how thoughts are stored, as Miller and Burton explain (1994). The first position, based on the concept of mental representations, suggests that these "thoughts" are stored as imagery, internally coded in a spatial structure. The second view, derived from Lotman's "language of thought" hypothesis, is that they are stored as propositions encoded verbally in linear or sequential order. Although thought can involve both pictorial and semantic elements, it is probably best described as a conceptual process that moves beyond both words and pictures into an abstract, meaning-based format for managing ideational relationships.

Reasoning is thought by many to be primarily verbal because of the power of classical logic, which drives the analysis of deduction and induction through verbal framing. But reasoning does not have to be verbally based, as in the case of aesthetics, theoretical mathematics, and ideation or creative thinking.

Geisser has studied logical reasoning, particularly in deaf people, and concludes that the deaf rely more on visual processing and thus have more problems with inferential reasoning. (1991) In contrast, semioticians argue that the Peircean concept of abduction--reasoning based on hypothesis building from clues in the natural environment--is a good indication of non-language-based inferential reasoning. Abduction reasons from what Hoopes calls "statistical inference" (1991), which relies more on educated guessing than on the rules of logic (induction, deduction), but which is nonetheless a highly developed form of inference building that originates primarily in visual and other sensory cues. Eco and Sebeok, in The Sign of Three, refer to it as "the conjectural paradigm" and point to its use in medical diagnosis and detective fiction. (1983) Sherlock Holmes, whose author, Conan Doyle, was trained as a doctor, is an example they use to demonstrate how abductive reasoning is built on observations and inferences.

Gregory, in his book, The Intelligent Eye, doesn't discuss abduction, but he does base a good deal of his theory of visual thinking on the concept of "hypothesis objects," which suggests that we puzzle out meaning through a structured process of hypothesis building about the things we are perceiving. These "units of seeing and thinking" are based on the objects we look at, although the units themselves are not necessarily words or pictures. (1970)

The role of syntax in language is another point in the debate about the ability of visual thinking to deal with complexities and abstractions. Worth, in his classic article "Pictures Can't Say Ain't," makes the argument that visuals cannot deal with what is not. His point is that negation is handled in language through syntax, and there is nothing in the visual information system that parallels that construct. He does mention, however, illusions and impossible pictures, which certainly demonstrate that visuals can deal with the unimaginable as well as the impossible. And certainly the ubiquitous circle-with-a-slash symbol has been adapted to many situations of negation. Even if Worth is correct, the lack of negation does not necessarily mean that visual perception has no role in abstract thinking.

Given the work by visual communication and semiotic scholars in the areas of inference, hypothesis building, and abductive reasoning, it is difficult to conclude that visual perception and thinking are unimportant in abstract reasoning, or that abstraction is tied exclusively to language-based information processing.

The Arbitrary Factor

The base of the linguistic analyst's primary modeling system is spoken language; writing is a superstructure built on this base. Both forms of language--spoken as well as written--represent highly arbitrary sign systems that have to be learned in order to be interpreted. The arbitrary factor is both a measure of the complexity of the system and an indication of the socio-cultural support system available to help with linguistic interpretation, a support system largely lacking for visual perception and communication.

Verbal messages are highly conventionalized; we are taught to speak and read these signs through extensive educational protocols, managed either at home by family (spoken language) or at school (written language). Visual signs are more often iconographic or indexical, and those forms of interpretation are, as Sebeok noted in a discussion of nonverbal communication, largely "wired in" rather than arbitrary and conventional. (1991, p. 65) Messaris makes the argument convincingly in his book Visual Literacy that we understand graphic, film, and video images not through learning a code but by transferring the real-world interpretational processes we use in everyday perception. (1994)

Our natural perceptual processes, in other words, govern much of our learning. Even visual signs that are arbitrary, such as street signs, are rarely formally taught to us. We acquire visual competency through development and experience rather than training--in other words, for our basic nonverbal and visual communication skills we are largely self-taught and that includes such sophisticated skills as making sense of MTV, watching multiple television channels simultaneously, and negotiating freeways as we puzzle out a map.

The "hard wiring" of the basic visual perception processes, however, doesn't suggest that visual interpretation is an unsophisticated form of understanding any more than the concept of deep structure suggests that language learning is less complex or sophisticated. In fact, Miller and Burton even speculate that the deep structure of language developed from perception. (1994, p. 72)

Even though visual perception and thinking may be self-taught, our skill levels in certain areas that are important to us, such as making sense of MTV or reading a map or schematic, can be extremely high. We do need training, however, if we want to deliberately develop more specialized visual skills in order to track something in the woods, puzzle out the images of cells we see in a microscope, read a negative or X-ray, or understand a Picasso painting. If we want to become proficient in producing visual messages, in almost every case we need specialist training. Some people are born painters, and most people have some limited ability to sketch, but most visual sign production calls for training.

Another aspect of the complexity of visual sign interpretation is the lack of a system for assigning conventional meanings for visual communication elements that are arbitrary, such as visual symbols. This suggests that visual information is subject to more active personal interpretation--more so than with language. As Gibson says, visual perception is "richer and more inexhaustible," i.e. more open to personal interpretation.

The interpretation of visual information, like semiotic approaches to meaning interpretation, is therefore highly subjective and highly projective, which puts more demands on a viewer than on a listener or reader. Because of the resemblance factor in icon interpretation and the experience factor in index interpretation, less formal training may be needed than for language--although the life experience required may be more demanding, if there were a way to measure such a thing. Regardless, the visual and nonverbal systems operate relatively untutored in our society, at least in comparison to language. With visuals we are much more on our own, both in learning and in interpreting, and that is why I believe visual learning in our contemporary society is as challenging an accomplishment as verbal learning.

The Argument for the Primacy of Visual Communication

For these reasons, I would like to suggest that visual communication is as much a primary system as verbal language, and that language based communication has been inappropriately privileged in contemporary Western culture. This is not to say that visual communication is more important, or language less so. The argument here is that an equally important form of communication--visual communication--has been ignored because of the strong emphasis our culture and the academy have placed on language.

In terms of development, the visual sign system is antecedent to language. In terms of complexity, visual interpretation can be seen as being more complex than verbal interpretation, primarily because of the lack of a conventionalized sign system and a formalized training protocol. Visual communication could also be considered primary because the viewer has to learn as well as manage more of the visual interpretative function independently. Finally, visual communication is neither derivative, nor peripheral to language, and therefore the designation of "secondary," "tertiary" or a "superstructure" built on language is inappropriate.

A more appropriate model would be one built on Paivio's notion of dual coding, which states that visual and verbal information are encoded and decoded by separate, specialized perceptual and cognitive systems. (1971, 1986) One system is visual/pictorial and manipulates the elements of imagery simultaneously; the other is linguistic and propositional and operates in sequence. As Miller and Burton explain, "The two systems are assumed to be structurally and functionally distinct." (1994, p. 73) Although independent, the two subsystems are also interdependent, so that a visual concept can be converted into a verbal label and vice versa. A more recent approach to explaining the interaction between the two systems is the metaphor of interactive parallel processing.

We'll close this argument with another quote from Sebeok, who has concluded that we should broaden our concept of communication: "Our habit of thinking of communication as consisting exclusively of language has delayed the study of communication." (1991, p. 57) By redefining the notion of a "primary" system to include visual communication as well as verbal, we may move further towards a more thorough analysis of the complexities of communication.




References
Burgin, Victor. 1983. "Seeing Sense." In Howard Davis and Paul Walton, eds. Language, Image and Media. New York: St. Martin's Press.

de Saussure, Ferdinand. 1989. "The Object of Linguistics" from "Course in General Linguistics" as reprinted in Robert C. Davis and Ronald Schleifer, eds. Contemporary Literary Criticism: Literary and Cultural Studies. New York: Longman.

Eco, Umberto. 1979. A Theory of Semiotics. Bloomington: Indiana University Press, Midland Book Edition.

Eco, Umberto. 1986. Semiotics and the Philosophy of Language. Bloomington: Indiana University Press, Midland Book Edition.

Eco, Umberto and Thomas A. Sebeok, eds. 1983. The Sign of Three. Bloomington: Indiana University Press.

Gibson, J.J. 1971. "The Information Available in Pictures." Leonardo 4 (Winter): 27-35.

Gregory, R. L. 1970. The Intelligent Eye. New York: McGraw-Hill.

Hoopes, James. 1991. Peirce on Signs. Chapel Hill, NC: The University of North Carolina Press.

Maher, Daniel R. 1990. "MTV and Sesame Street: Visual Images and Language," Investigating Visual Literacy, Proceedings of the 1989 IVLA conference.

Messaris, Paul. 1994. Visual Literacy: Image, Mind, & Reality. Boulder, CO: Westview Press.

Metz, Christian. 1980. "The Perceived and the Named," Visual Communication 6:3, 56-68.

Paivio, Allan. 1971. Imagery and Verbal Processes. New York: Holt, Rinehart and Winston.

Paivio, Allan. 1986. Mental Representations: A Dual Coding Approach. Oxford: Oxford University Press.

Sebeok, Thomas A. 1991. A Sign is Just a Sign. Bloomington: Indiana University Press.

Worth, Sol. 1981. "Pictures Can't Say Ain't." Chapter 7 in Studying Visual Communication. Sol Worth and Larry Gross, eds. Philadelphia: University of Pennsylvania Press.

Above copied from: http://spot.colorado.edu/~moriarts/primelang.html
