Semantics
Semantics (Greek sēmantikos, "giving signs, significant, symptomatic", from sēma (σῆμα), "sign") refers to aspects of meaning, as expressed in language or other systems of signs. Semantics contrasts with syntax, the study of the structure of sign systems (focusing on form, not meaning). When analyzing a language, an analysis can be said to cover both the "syntax and semantics", that is, both the form and the meanings of phrases in the language. The term semantics applies not only to natural languages, such as English, German or Latin, but also to technical languages, such as computer programming languages.
Related to semantics is the field of pragmatics, which studies the practical use of signs by agents or communities of interpretation within particular circumstances and contexts.[1] By the usual convention that calls a study or a theory by the name of its subject matter, semantics may also denote the theoretical study of meaning in systems of signs.
Semanticists generally recognize two sorts of meaning that an expression (such as the sentence, "John ate a bagel") may have: (1) the relation that the expression, broken down into its constituent parts (signs), has to things and situations in the real world as well as possible worlds, and (2) the relation the signs have to other signs, such as the sorts of mental signs that are conceived of as concepts.
Most theorists refer to the relation between a sign and its objects, as always including any manner of objective reference, as its denotation. Some theorists refer to the relation between a sign and the signs that serve in its practical interpretation as its connotation, but there are many more differences of opinion and distinctions of theory that are made in this case. Many theorists, especially in the formal semantic, pragmatic, and semiotic traditions, restrict the application of semantics to the denotative aspect, using other terms or completely ignoring the connotative aspect.
Etymology
Semantics is derived from the Greek "σημαντικός" or semantikos, meaning "significant". The word semantic appears in French as sémantique, as used by Michel Bréal during the 19th century, in his 1897 book published in Paris, Essai de sémantique, considered the first use of the term "semantics" in the modern sense.
Linguistics
In linguistics, semantics is the subfield that is devoted to the study of meaning, as borne on the syntactic levels of words, phrases, sentences, and even larger units of discourse (referred to as texts). As with any empirical science, semantics involves the interplay of concrete data with theoretical concepts. Traditionally, semantics has included the study of connotative sense and denotative reference, truth conditions, argument structure, thematic roles, discourse analysis, and the linkage of all of these to syntax.
The decompositional perspective towards meaning holds that the meaning of words can be analyzed by defining meaning atoms or primitives, which establish a language of thought. An area of study is the meaning of compounds, another is the study of relations between different linguistic expressions (homonymy, synonymy, antonymy, polysemy, paronyms, hypernymy, hyponymy, meronymy, metonymy, holonymy, exocentric, and endocentric).
The dynamic turn in semantics
This traditional view of semantics, as a finite meaning inherent in a lexical unit that can be composed to generate meanings for larger chunks of discourse, is being fiercely debated in the emerging domain of cognitive linguistics[2] and also in the non-Fodorian camp in Philosophy of Language[3]. The challenge is motivated by
- factors internal to language, such as the problem of resolving indexicals or anaphora (e.g. this X, him, last week). In these situations "context" serves as the input, but the interpreted utterance also modifies the context, so it is also the output. Thus, the interpretation is necessarily dynamic, and the meanings of sentences are viewed as context-change potentials instead of propositions.
- factors external to language, i.e., language is not a set of labels stuck on things, but "a toolbox, the importance of whose elements lies in the way they function rather than in their attachments to things."[3] This view reflects the position of the later Wittgenstein and his famous game example, and is related to the positions of Quine, Davidson and others.
A concrete example of the latter phenomenon is semantic underspecification: meanings are not complete without some elements of context. To take the single word "red", its meaning in a phrase such as red book is similar to many other usages and can be viewed as compositional[4]. However, the colours implied in phrases such as "red wine" (very dark), "red hair" (coppery), "red soil", or "red skin" are very different. Indeed, these colours by themselves would not be called "red" by native speakers. These instances are contrastive: "red wine" is so called only in comparison with the other kind of wine (which also is not "white" for the same reasons). This view goes back to de Saussure:
- Each of a set of synonyms like redouter ('to dread'), craindre ('to fear'), avoir peur ('to be afraid') has its particular value only because they stand in contrast with one another. No word has a value that can be identified independently of what else is in its vicinity.[5]
and may go back to earlier Indian views on language, especially the Nyaya view of words as indicators and not carriers of meaning[6].
An attempt to defend a system based on propositional meaning for semantic underspecification can be found in the Generative Lexicon model of James Pustejovsky, who extends contextual operations (based on type shifting) into the lexicon. Thus meanings are generated on the fly based on finite context.
Prototype theory
Another set of concepts related to fuzziness in semantics is based on prototypes. The work of Eleanor Rosch and George Lakoff in the 1970s led to a view that natural categories are not characterizable in terms of necessary and sufficient conditions, but are graded (fuzzy at their boundaries) and inconsistent as to the status of their constituent members.
Systems of categories are not objectively "out there" in the world but are rooted in people's experience. These categories evolve as learned concepts of the world; meaning is not an objective truth, but a subjective construct, learned from experience, and language arises out of the "grounding of our conceptual systems in shared embodiment and bodily experience"[7]. A corollary of this is that the conceptual categories (i.e. the lexicon) will not be identical for different cultures, or indeed, for every individual in the same culture. This leads to another debate (see the Whorf-Sapir hypothesis or Eskimo words for snow).
Computer science
In computer science, considered in part as an application of mathematical logic, semantics reflects the meaning of programs or functions.
In this regard, semantics permits programs to be separated into their syntactical part (grammatical structure) and their semantic part (meaning). For instance, the following statements use different syntaxes (languages), but have the same semantics:
- x += y; (C, Java, etc.)
- Let x = x + y or x = x + y (various BASICs)
Generally these operations would all perform an arithmetical addition of 'y' to 'x'.
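The equivalence can be checked directly. In this Python sketch (the function names are purely illustrative), two syntactically different assignment forms denote the same operation:

```python
# Two syntactically different statements, one meaning: add y to x.

def add_augmented(x, y):
    x += y        # augmented-assignment syntax
    return x

def add_explicit(x, y):
    x = x + y     # explicit-assignment syntax
    return x

# Different surface forms, identical semantics for numbers:
assert add_augmented(3, 4) == add_explicit(3, 4) == 7
```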
Semantics for computer applications falls into three categories[8]:
- Operational semantics: The meaning of a construct is specified by the computation it induces when it is executed on a machine. In particular, it is of interest how the effect of a computation is produced.
- Denotational semantics: Meanings are modelled by mathematical objects that represent the effect of executing the constructs. Thus only the effect is of interest, not how it is obtained.
- Axiomatic semantics: Specific properties of the effect of executing the constructs are expressed as assertions. Thus there may be aspects of the executions that are ignored.
The Semantic Web refers to the extension of the World Wide Web through the embedding of additional semantic metadata.
Psychology
In psychology, semantic memory is memory for meaning; in other words, it is the aspect of memory that preserves only the gist, the general significance, of remembered experience, while episodic memory is memory for the ephemeral details, the individual features, or the unique particulars of experience. Word meanings are measured by the company they keep, that is, by the relationships among words themselves in a semantic network. In a network created by people analyzing their understanding of words (such as WordNet) the links and decomposition structures of the network are few in number and kind, and include "part of", "kind of", and similar links. In automated ontologies the links are computed vectors without explicit meaning. Various automated technologies are being developed to compute the meaning of words: latent semantic indexing and support vector machines, as well as natural language processing, neural networks and predicate calculus techniques.
Semasiology
In International Scientific Vocabulary semantics is also called semasiology.
References
- ^ Otto Neurath, Rudolf Carnap and Charles F. W. Morris (eds.) (1955). International Encyclopedia of Unified Science. Chicago, IL: University of Chicago Press.
- ^ Ronald W. Langacker (1999). Grammar and Conceptualization. Berlin/New York: Mouton de Gruyter. ISBN 3110166038.
- ^ a b Jaroslav Peregrin (2003). Meaning: The Dynamic Turn. Current Research in the Semantics/Pragmatics Interface. London: Elsevier.
- ^ P. Gärdenfors (2000). Conceptual Spaces. Cambridge, MA: MIT Press/Bradford Books.
- ^ Ferdinand de Saussure (1916). Course in General Linguistics (Cours de linguistique générale).
- ^ Bimal Krishna Matilal (1990). The Word and the World: India's Contribution to the Study of Language. Oxford. The centuries-long Nyaya-Mimamsa debate on whether sentence meaning arises through composition of word meanings, which are primary, or whether word meanings are obtained through analysis of the sentences in which they appear, is discussed in chapter 8.
- ^ George Lakoff and Mark Johnson (1999). Philosophy in the Flesh: The Embodied Mind and Its Challenge to Western Thought. Chapter 1. New York: Basic Books.
- ^ Nielson, Hanne Riis & Nielson, Flemming (1995). Semantics with Applications: A Formal Introduction (1st ed.). Chichester, England: John Wiley & Sons. ISBN 0-471-92980-8.
See also
Major philosophers and theorists
- Alfred Tarski
- Rudolf Carnap
- P.F. Strawson
- H.P. Grice
- J.L. Austin
- Keith Donnellan
- Charles E. Osgood
- Saul Kripke
- John Perry
- Nathan Salmon
- Scott Soames
- David Kaplan
- Nelson Goodman
- Jürgen Habermas
- Ray Jackendoff
- John Lyons
- Richard Montague
- Charles Sanders Peirce
- C.K. Ogden
- I.A. Richards
- Benjamin Whorf
- Anna Wierzbicka
- S. I. Hayakawa
- Alfred Korzybski
Linguistics and semiotics
- Colorless green ideas sleep furiously
- Computational semantics
- Discourse representation theory
- General semantics
- Natural semantic metalanguage
- Onomasiology
- Pragmatic maxim
- Pragmaticism
- Pragmatism
- Semantic change
- Semantic class
- Semantic feature
- Semantic field
- Semantic lexicon
- Semantic progression
- Semantic property
- Semeiotic
- Sememe
- Semiosis
- Semiotics
Logic and mathematics
- Formal logic
- Game semantics
- Model theory
- Proof-theoretic semantics
- Semantics of logic
- Semantic theory of truth
- Truth-value semantics
Computer science
- Formal semantics of programming languages
- Semantic HTML
- Semantic integration
- Semantic link
- Semantic service oriented architecture
- Semantic spectrum
- Semantic analysis
- Semantic Reasoner