CHAPTER I
INTRODUCTION

In the 1950s the school of linguistic thought known as transformational-generative grammar received wide acclaim through the works of Noam Chomsky. Chomsky postulated a syntactic base of language (called deep structure), which consists of a series of phrase-structure rewrite rules, i.e., a series of (possibly universal) rules that generate the underlying phrase structure of a sentence, and a series of rules (called transformations) that act upon the phrase structure to form more complex sentences. The end result of a transformational-generative grammar is a surface structure that, after the addition of words and pronunciations, is identical to an actual sentence of a language. All languages have the same deep structure, but they differ from each other in surface structure because of the application of different rules for transformations, pronunciation, and word insertion. Another important distinction made in transformational-generative grammar is the difference between language competence (the subconscious control of a linguistic system) and language performance (the speaker's actual use of language). Although the first work done in transformational-generative grammar was syntactic, later studies have applied the theory to the phonological and semantic components of language.

In linguistics, a transformational grammar, or transformational-generative grammar (TGG), is a generative grammar, especially of a natural language, developed in the tradition of phrase structure grammars (as opposed to dependency grammars); the term also names the tradition of such specific transformational grammars. Much current research in transformational grammar is inspired by Chomsky's Minimalist Program.
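The mechanics of rewrite rules can be sketched in a few lines of code. The following Python fragment is purely illustrative (the grammar, lexicon, and sentence are invented toys, not Chomsky's rules): it expands the leftmost nonterminal until only category nodes remain, then inserts words.

# A toy phrase-structure grammar: each rule rewrites one symbol
# into a sequence of symbols (context-free rewriting).
RULES = {
    "S":  ["NP", "VP"],
    "NP": ["Det", "N"],
    "VP": ["V", "NP"],
}

# Toy lexical insertion: each category node is replaced by a word.
LEXICON = {"Det": "the", "N": "woman", "V": "saw"}

def derive(symbols):
    """Expand the leftmost nonterminal repeatedly (a derivation)."""
    while True:
        for i, sym in enumerate(symbols):
            if sym in RULES:
                symbols = symbols[:i] + RULES[sym] + symbols[i + 1:]
                break
        else:
            return symbols  # only terminal category nodes remain

bottom_line = derive(["S"])              # ['Det', 'N', 'V', 'Det', 'N']
print(" ".join(LEXICON[c] for c in bottom_line))  # the woman saw the woman

In the Aspects model discussed below, the base rules play the role of RULES here, and lexical insertion is far richer than this single dictionary, but the derivation loop captures the basic idea of rewriting.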
CHAPTER II
DISCUSSION

A. Revision of the Syntactic Structures Model

After a slow start, transformational-generative grammar finally took hold. Chomsky and others worked at extending and refining the early theoretical model, and eventually arrived at enough modifications and revisions that the original theory had to be reformulated. In addition to the work on the TG model, Chomsky devoted much time to enlarging and refining the assumptions and the philosophical views which had led him to develop a TG grammar in the first place, and to defending his views to a still-skeptical group of linguists.

One of the most controversial of these assumptions was Chomsky's assertion that the linguist must rely on the linguistic intuitions of native speakers. Most structuralists balked at the word intuition, and they condemned Chomsky's method as subjective, unscientific, and circular.

Chomsky answered such charges by insisting first of all that it is far more dangerous to view objectivity as an end in itself. If the goal is insight and understanding, objectivity is merely a tool in the search. Besides, as with any science, it scarcely makes sense to begin by demanding the very evidence you are looking for. He admitted that there are certainly problems inherent in relying on intuition, but at the moment he knew of no better way to begin. As for reliance on intuition as the ultimate test of his theory, Chomsky answered that if linguistic intuition is what the linguist seeks to explain, there is no better way to test his results than to see whether they are satisfactory explanations to native speakers.

Still not convinced, the structuralists retorted that if Chomsky's grammar has as its aim the study of intuition, then its results will not tell us anything we do not already know, since we are all native speakers. They accused Chomsky of describing intuition, not grammar. Chomsky agreed that intuition was indeed precisely what he was describing. His empirical data were the native speaker's linguistic intuitions, for which his grammar was seeking an explanation. The goal of transformational-generative grammar theory is to gain insight into and understanding of the nature of the language user's intuitive linguistic judgments.

Chomsky pointed out
that the study of linguistic intuition poses many problems, however. For example, a speaker's knowledge that a given sentence is or is not grammatical is most often a tacit knowledge, below the level of conscious awareness and not therefore immediately available to him. The task of the linguist, as Chomsky saw it, was to provide explanations and to present them in such a way that a language user's linguistic consciousness would be raised to the level of awareness. He was not suggesting, however, that such consciousness-raising would involve teaching the speaker anything new about language; the goal is to find ways of pointing out things that language users have known all along, of making them aware of their considerable linguistic intuition.

To illustrate the point that explanations are not always ones to which a competent speaker has ready access, Chomsky discussed the problem of ambiguity. At first impression the speaker-hearer may not, for a number of reasons, recognize a sentence as ambiguous. Or he or she may realize that it can be interpreted in two ways but miss a third or fourth. For example:

I had a book stolen at the library.

At first glance, most speakers of English will probably recognize that the sentence is ambiguous. Yet they will very likely not be aware that there are at least five ways it can be interpreted, and possibly more:

1. A book of mine was stolen while I was at the library.
2. A book of mine was stolen while it was at the library.
3. I arranged for someone to steal a book while it was at the library.
4. I had in my possession a book which had been previously stolen at the library.
5. I almost completed the stealing of a book at the library (but was caught).

Another interesting fact about language use on
which linguists have focused in recent years is that the speaker's linguistic performance (the actual use of language in concrete situations) is different from, and seldom fully reflects, linguistic competence (the speaker-hearer's knowledge of his language). The actual sentences native speakers utter can tell us very little about their language competence, for in the normal course of daily living it is inevitable that all kinds of distractions will interfere with linguistic performance. In other words, there are limits to performance that have nothing to do with grammatical ability.

It was this realization that led Chomsky to reject the structuralists' data-collecting approach to language study: by deliberately confining themselves to a description of actual spoken utterances, the structuralists limited themselves to the study of linguistic performance.

The research of Chomsky and other transformationalists into linguistic competence led them in turn to new inquiries. As soon as one reflects upon the actual mish-mash of performance (of others) from which a child inevitably learns his language, it becomes all the more astonishing that language acquisition is possible at all. Transformational linguists, like Plato and Descartes before them, came to marvel at the awesome fact that every normal child is able to acquire so undeniably complex an ability as language competence. Chomsky had four principal aims in writing Aspects:

1. To clear up misunderstandings and to answer the questions that had been raised about TG theory, especially those raised by the structuralists;
2. To point out the weaknesses and defects in the early model, along with the arguments and evidence that had convinced him of the validity of some criticisms;
3. To suggest revisions and modifications which would remedy these defects;
4. To call attention to unresolved problems still in need of investigation.

Among those things which made some linguists unhappy with the early form of transformational grammar was its failure to
deal satisfactorily with the problem of meaning and meaning
relationships. Many linguists came to feel that meaning is basic to
language competence, and that therefore the grammar theory ought
somehow to incorporate semantic considerations in the
phrase-structure part of the grammar.

Chomsky had argued in Syntactic Structures that a grammar is best formulated as a self-contained syntactic theory, without reference to, and independent of, considerations of semantics. This was not to deny the importance of meaning in language. He simply believed that once one discovers the syntactic structure of a language, that knowledge can be put to use in discovering the meaning function of the language. Robert Lees, writing in 1960, argued that the negative statement should be regarded as the derivation of a kernel string rather than as a transformation. Edward Klima also presented a similar formal argument regarding the interrogative sentence. Both of these sentence types had been included among the single-base optional transformations in the early TG model. But these linguists contended that, since there is no known language in which the question and the negative statement fail to exist, this linguistic fact should be reflected in the basic rules of the grammar. In effect, what they were arguing was that all optional transformations should be meaning-preserving. To understand this reasoning, consider the following five sentences:

1. John always eats lunch.
2. John does not ever eat lunch.
3. John never eats lunch.
4. John always eats no lunch.
5. John does not always eat lunch.

In the early form of TG grammar, all five of these sentences would be analyzed as derivations from the same underlying kernel string, John + eat + lunch. The intuition of the English speaker, however, is that there are at least three separate meanings or interpretations of these five sentences. At first, Chomsky was
skeptical. But then, Jerrold Katz and Paul Postal worked out the
outlines of a transformational semantic theory which demonstrated
the feasibility of assigning semantic features to particular
lexical items, and which also demonstrated that the syntactic
structure of a sentence is often influenced by the semantic
features associated with a particular lexical choice. They
speculated that the device of assigning features might make it
possible to simply get rid of the notion of optional generalized
transformations altogether, and to account for the generation of
compound and complex sentences directly from the PS rules,
instead.

Chomsky had accepted the notion that all transformations should be meaning-preserving. Now he conceived a revised model with a base component called the deep structure. The base component would include syntactic rules, semantic and phonological information represented by feature matrixes of lexical items, and abstract phrase markers (NEG, Q, PAS). All sentences would then be generated directly from the deep structure, or base, by means of various transformation operations, to become actual sentences or surface structures.

The idea of making revisions to the early TG model
was appealing, for what it would mean is that all the abstract material contained in the deep structure (the base) of the grammar would represent linguistic universals. Only the transformation operations would give instructions for the idiosyncratic forms of particular languages. Another advantage of the revised model is that the linguistic property of recursiveness, which the Syntactic Structures model had for the most part assigned to the transformation rules, would now be completely accounted for in the base. This would mean, for example, that whenever the constituent NP appears in a deep structure derivation, we would have the option of embedding an S (sentence) after it.

The most serious problem
encountered in reformulating the grammar model was that of deciding how to include both semantic and syntactic information in the deep structure rules. Chomsky's solution (in Aspects) was to continue to consider the syntax rules primary. Semantic rules would then be merely interpretive. The difficulty with keeping the two components separate in this way, however, is that linguists cannot agree, even today, on where the line between semantics and syntax should be drawn. Clearly both syntactic features and semantic features are important, but in some cases it isn't clear whether a feature is a syntactic one, a semantic one, or both.

Other Unresolved Problems

In Aspects Chomsky expressed the belief that in all probability these questions will remain unanswered for some time. It isn't clear, for example, how a grammar can account for the kind of semantic considerations that are beyond the scope of the lexicon; nor is it clear whether certain semantic considerations are universals or are, rather, particular language idiosyncrasies. Another unresolved problem is that of deciding how to explain the derivational process. A further problem is that of idioms.

B. Standard TG Grammar Theory: The Aspects Model

The Aspects TG grammar model has
three major components: a syntax, a semantics, and a phonology.
Of these three, syntax is central. It contains a base component and a transformational component. The base component contains a finite set of phrase-structure rules (both branching rules and subcategorization rules), a lexicon or dictionary (also finite), and some preliminary context-free lexical insertion rules (L rules). The transformational component contains context-sensitive transformational rules of three types: lexical insertion rules, general transformation rules (these are the familiar optional T rules of the Syntactic Structures grammar), and two kinds of local transformation rules: affix-incorporation and segment-structure T rules. The rules of the base component are said to be context-free, which means that each one of them applies in ignorance of any other rules. The transformation rules, on the other hand, are by their very nature context-sensitive: they apply only in certain restricted environments.
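The context-free/context-sensitive distinction can be sketched in code. In the Python toy below (an illustration of the distinction, not part of the Aspects formalism; the affix-hopping rule is a standard textbook example used here for concreteness), the base rule fires wherever its symbol occurs, while the transformation first checks its environment.

# A context-free base rule: rewrites its symbol the same way
# wherever that symbol occurs, ignoring the surroundings.
def base_rule(symbol):
    return ["V", "NP"] if symbol == "VP" else [symbol]

# A context-sensitive transformation: applies only in a restricted
# environment (here, a tense affix immediately before a verb).
def affix_hopping(symbols):
    out, i = [], 0
    while i < len(symbols):
        if symbols[i] == "Past" and i + 1 < len(symbols) and symbols[i + 1] == "V":
            out.append("V+Past")    # environment matched: apply the rule
            i += 2
        else:
            out.append(symbols[i])  # environment not matched: leave alone
            i += 1
    return out

print(base_rule("VP"))                           # ['V', 'NP']
print(affix_hopping(["NP", "Past", "V", "NP"]))  # ['NP', 'V+Past', 'NP']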
The semantic component and the phonological component of the Aspects grammar are said to be interpretive. The semantic component operates on the deep structure level; it determines a semantic interpretation of a sentence generated by the rules of the syntactic component. In other words, the semantic component takes as input the information generated by the base rules of the grammar and assigns a semantic or meaning interpretation to the string.

The function of the phonological component is also interpretive. It provides information concerning the pronunciation of constituents. That is, once all transformations have been performed, the phonological component of the grammar finishes the job of converting a deep structure to a surface structure (an actual spoken sentence) by assigning pronunciation features to it.

The Total Grammar System

I. SYNTACTIC COMPONENT
   The Base (Context-Free)
   A. Phrase-Structure Rules
      1. Branching Rules
      2. Subcategorization Rules
         a. Strict Subcategorization Rules
         b. Selectional Rules
   B. The Lexicon and Preliminary Lexical Insertion Rules
   C. Transformation Rules (Context-Sensitive)
      1. Final Lexical Insertion T Rules
      2. General T Rules
      3. Local T Rules
         a. Segment Transformations
         b. Other Local Transformations

II. SEMANTIC COMPONENT
   Operates on the base component. Influences subcategorization rules and the lexicon, and assigns a semantic interpretation to the deep structure generated by the PS rules.

III. PHONOLOGICAL COMPONENT
   Contributes phonological feature matrix information to the lexicon. After application of all T rules, provides a phonological interpretation for the surface structure.
[Diagram: the total grammar system. The syntactic component's base (PS rules: branching and subcategorization rules; the lexicon; preliminary lexical insertion) generates the deep structure, which the interpretive semantic component reads; the transformation rules (final lexical insertion T rules, general T rules, local T rules) convert the deep structure into the surface sentence (phonetic structure), which the interpretive phonological component interprets.]
C. Aspects Model: The Base Component, I

There are two aspects of the base component to examine: (1) the phrase structure rules, of which there are two types, branching rules and subcategorization rules; and (2) the lexicon. We shall apply these rules in the order in which they are listed.

Phrase Structure Rules

Branching or Rewriting Rules

The branching rules are context-free rules which act blindly to produce
any one of a number of strings terminating in category nodes. The
generation of a sentence begins with the rewriting or branching
rules. You are already familiar with this kind of rule, because we
used branching rules in the earlier Transformational Generative
model. A few significant changes have been made in the branching rules of the Aspects grammar. One important change is the inclusion of transformation-signaling abstract phrase markers in the phrase structure rules of the grammar. The rewriting rules of the revised grammar contain, in addition to formative constituents, abstract phrase markers (Q, NEG, PAS, and so on) which will not be realized as actual words or word parts. These phrase markers, which appear only in the deep structure, provide semantic information to the semantic component and trigger a particular transformation process at some point in the generation of a sentence. In addition to the old single-base optional transformations, now signaled by abstract markers, the revised phrase structure rules have also moved the explanation of the multiple-base transformations to the base component of the grammar. Thus, whereas in the early model the universal language property of indefinite recursiveness was wholly accounted for by the optional double-base transformation rules, this property can now be explained by the branching rules of the base component. These are major changes: all transformational signals have been moved to the phrase structure rules of the base of the grammar, with the result that the base rules are now capable of expressing a much deeper level of abstraction than was permitted by the earlier phrase structure rules. Moreover (and this is the real justification for the change), the base rules are now able to reflect at least two properties of language which transformational linguists recognize as linguistic universals. One is the property of indefinite recursiveness just mentioned; the other is the fact that all known natural languages make use of transformations, or, to put it another way, that every sentence in every known natural language is a transformation. These rule changes mean that we will no longer speak of optional transformations, at least not in the same sense as we used that term in the early transformational-generative model. The very abbreviated deep-structure tree diagrams below illustrate some
of these rule changes.

Figure T1. Negative transformation, deep structure. [Tree diagram omitted.] Negative surface structure: Janice may not like this book.

Figure T2. Question transformation, deep structure. [Tree diagram omitted.] Question surface structure: Will the woman be happy?

Figure T3. Passive transformation, deep structure. [Tree diagram omitted.] Passive surface structure: The road was ruined by the tractor.

Figures T1, T2, and T3 are tree structures which illustrate
the revised rewriting rules for the sentence:

S → (SM) NP VP
SM → (NEG) (Q) (PAS)

The first of these rules says that preceding an entire string there may occur one or more abstract phrase markers, or sentence modifiers (SM). The second rule says that a sentence modifier may be one or more abstract dummy symbols like NEG (negative), Q (question), and PAS (passive). All such abstract phrase markers will continue to appear in the phrase marker tree until we reach the stage in the sentence generation process for the application of the general transformation rules. This procedure will not take place until after all phrase structure rules have been run through and lexical insertion transformations have taken place.

This revised
derivational concept has two advantages:

1. It permits us to see that two sentences like Janice may like this book and Janice may not like this book are different in meaning at the deep-structure semantic level.
2. It also provides an explanation for the speaker's intuition that the sentences are otherwise syntactically identical.
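The way an abstract marker like NEG triggers a transformation can be illustrated with a small Python sketch. Everything here is a simplification for exposition (one hard-coded modal, a flat string instead of a tree): the deep structure carries NEG as a dummy symbol, and the general transformation deletes it and inserts not after the modal, yielding the surface string of Figure T1.

# Deep structure for "Janice may not like this book": the abstract
# marker NEG appears in the string but is never realized as a word.
deep = ["NEG", "Janice", "may", "like", "this", "book"]

def neg_transformation(symbols):
    """If NEG is present, delete it and insert 'not' after the modal."""
    if "NEG" not in symbols:
        return symbols                      # rule fires only when signaled
    rest = [s for s in symbols if s != "NEG"]
    i = rest.index("may")                   # toy: one known modal
    return rest[:i + 1] + ["not"] + rest[i + 1:]

print(" ".join(neg_transformation(deep)))   # Janice may not like this book

In the full grammar the transformation operates on a phrase marker tree rather than a flat string, but the triggering role of the abstract marker is the same.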
Figure T4. Embedded restrictive relative clause transformation, deep structure. [Tree diagram omitted.] Surface structure: The girl who lost the ticket cried.

Figure T5. [Tree diagram omitted; the triangle represents an approximated structure.] Surface structure: The fact that Tom likes books is true. After reduction: That Tom likes books is true.

Figure T6. Embedded nominalized that-clause, deep structure. [Tree diagram omitted.] Surface structure: The fact that Tom likes books is true. After reduction T: That Tom likes books is true.

The tree structures in
Figures T4, T5, and T6 illustrate sentence embedding, one of the two sentence-combining processes. In each of the three diagrams there are two underlying deep-structure sentences, one of which will become embedded, the other of which will dominate by the time the structure surfaces. Notice that these phrase marker trees offer a satisfying explanation for the structural fact that one string (the main clause) is felt to be more important than the other (the subordinate clause). In all such derivations, only the main clause will have immediately branched off from the original S node. All embedded sentences, on the other hand, are shown to be immediately dominated by an NP node.

Figure T7. Conjunction transformation, deep structure. [Tree diagram omitted.] Surface structure: Mary dances and Jane sings.

Figure T8. Nonrestrictive relative clause conjunction transformation, deep structure. [Tree diagram omitted.] Surface structure: John, who is my friend, likes television. After reduction: John, my friend, likes television.

The last two tree diagrams, Figures T7 and T8, illustrate the second
sentence-combining process, conjunction. Figure T7 shows the deep structure combining of two equally dominant surface-structure main clauses. Notice that each of them branches off from the original S node simultaneously, and thus each clause has a separate but equal existence of its own from its inception. Moreover, neither sentence is immediately dominated by an NP (noun phrase) node (as in the case of an embedded sentence) but rather by an S (sentence) node.

Figure T8 illustrates a sentence which will surface with an embedded nonrestrictive relative clause. This kind of sentence, most linguists thought, had simply not been adequately explained by the old rewriting rules, for those early rules had treated restrictive and nonrestrictive relative clauses as if they were alike. Although they are somewhat alike, the English speaker also knows that there is an important difference both in the meaning and in the pronunciation of the two sentences:

1. The girl who is wearing the red dress was late.
2. Joyce, who is wearing the red dress, was late.

The relative clause in sentence 1 is felt by the speaker to be
a vital and intimate part of the dominant subject noun phrase. It is essential to have this relative clause modifier if we are to understand which girl the sentence is talking about, and grammarians have recognized this fact by labeling such a relative clause restrictive. In sentence 2, on the other hand, the relative clause is not essential. It is not necessary to restrict the subject noun with a modifier which further identifies it, for the proper name Joyce is specific identification enough (the assumption being, of course, that there is not more than one Joyce in the context in which the sentence is spoken). The underlying structure is a conjunction:

Joyce and Joyce is wearing the red dress was late
or
Joyce (Joyce is wearing the red dress) was late

Thus the tree diagram shows the nonrestrictive relative clause as branching off immediately from the main sentence node (as a conjoined sentence does), but from the same S node and at the same time as its fellow constituents NP and VP.

Subcategorization PS Rules

These rules define the syntactic
requirements to be met by each constituent in a given string, and the semantic requirements imposed on those constituents. When all of the branching phrase structure rules are exhausted, we cannot make lexical insertions at random: we must replace a noun node with a word that the dictionary says is a noun, a verb node with a verb, and so on. To see what the strict subcategorization rules do, consider the following sets of sequences.

Set A
  go should the not he and work
  by however book this sad is of

Set B
  he will lie the book on the table
  the girl seemed the pencil
  they hit a sad

Set C
  John frightened the house
  the milk that he ate admired them
  the little boy is pregnant

These sets are all ungrammatical, for
they all violate basic phrase structure rules. The strings in Set A, a native speaker of English would no doubt agree, are the worst of the lot. The explanation for their complete unacceptability is that they violate the most basic of the phrase structure rules, the branching rules. Except for the accident of recognizable English words, these sequences cannot even be called English. The strings in Set B are better; at least we recognize something of English syntax in them. These sequences violate the strict subcategorization phrase structure rules, which tell us that certain verbs require complements of specific part-of-speech categories: a noun phrase must follow a transitive verb, an adjective must follow a verb like seem, and so on. The early transformational-generative grammar model made an effort to handle strict subcategorization problems of this sort by identifying transitive, intransitive, and linking verb types in the branching rules, and indeed, those rules would prevent such mishaps as are illustrated in the strings of Set B. The revised model instead marks these requirements as contextual features on individual verbs. This change makes for more accurate specifications, for it has the added advantage of specifically identifying the particular syntactic or contextual feature characteristics of each individual lexical verb. In such contextual feature specifications, the blank space represents the place where the item possessing the feature must stand.

Consider now the ungrammatical sequences of
Set C. These strings are ungrammatical for a third reason, having to do in this case with semantic impropriety. They violate a second kind of subcategorization phrase structure rule: a rule of lexical selection within a given category. One does not frighten an inanimate object like a house; one does not eat milk, one drinks it; only a person can admire something, while an inanimate object, a situation, or an idea cannot; and only a woman, not a little boy, can be pregnant. To apply the selection phrase structure rules it will be necessary to consult the lexicon, or dictionary, of the grammar. Then, armed with the required information about a word's meaning, we can specify its selection features. It is at this point in the development of a grammar theory which includes feature specifications that the linguist must distinguish between those features which are syntactic ones and those which are semantic.

Incidentally, by noting contextual verb features
in this new way, we are able to simplify the branching rules. It is no longer necessary to rewrite V as Vt, Vi, or a linking verb category:

eat      [+V, +___NP]            (eat is a verb; eat must be followed by an NP)
walk     [+V, +___#]             (walk is a verb; walk requires no complement)
believe  [+V, +___{NP, that-S}]  (believe is a verb; believe must be followed by either an NP or a that-sentence)
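Contextual features of this sort translate directly into a data structure. The Python sketch below is our own illustration (the frames mirror the three entries above; an empty frame means no complement is allowed): each verb lists the complement-category frames it permits, and a checker rejects strings whose verb frame is unsatisfied.

# Strict subcategorization features: for each verb, the complement
# frames that may follow it ([] = no complement).
LEXICON = {
    "eat":     [["NP"]],
    "walk":    [[]],
    "believe": [["NP"], ["that-S"]],
}

def frame_ok(verb, complements):
    """True if the complement categories satisfy the verb's feature."""
    return complements in LEXICON[verb]

print(frame_ok("eat", ["NP"]))        # True:  'eat lunch'
print(frame_ok("walk", []))           # True:  'walk'
print(frame_ok("walk", ["NP"]))       # False under this toy entry
print(frame_ok("believe", ["that-S"]))  # True: 'believe that S'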
The Lexicon

The lexical aspect of a verb is a part of the way in which that verb is structured in relation to time. Any event, state, process, or action which a verb expresses (collectively, any eventuality) may also be said to have the same lexical aspect. Lexical aspect is distinguished from grammatical aspect: lexical aspect is an inherent property of a (semantic) eventuality, whereas grammatical aspect is a property of a (syntactic or morphological) realization. Lexical aspect is invariant, while grammatical aspect can be changed according to the whims of the speaker.

For example, eat an apple differs from sit in that there is a natural endpoint or conclusion to eating an apple. There is a time at which the eating is finished, completed, or all done. By contrast, sitting can merely stop: unless we add more details, it makes no sense to say that someone "finished" sitting. This is a distinction of lexical aspect between the two verbs. Verbs that have natural endpoints are called "telic" (from Ancient Greek telos, end); those without are called "atelic."

Zeno Vendler (1957) classified verbs
into four categories: those that express "activity", "accomplishment", "achievement", and "state". Activities and accomplishments are distinguished from achievements and states in that the former allow the use of the continuous and progressive aspects. Activities and accomplishments are distinguished from each other by boundedness: activities do not have a terminal point (a point before which the activity cannot be said to have taken place, and after which the activity cannot continue: for example, "John drew a circle"), whereas accomplishments do. Of achievements and states, achievements are instantaneous whereas states are durative. Achievements and accomplishments are distinguished from one another in that achievements take place immediately (such as in "recognize" or "find") whereas accomplishments approach an endpoint incrementally (as in "paint a picture" or "build a house").
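Vendler's four classes can be summarized with the features this passage uses: telicity, duration, and compatibility with the progressive. The encoding below is our own summary in Python; the class names and feature assignments follow the paragraph above.

# Each Vendler class as a bundle of features: telic = natural endpoint;
# durative = extended in time; progressive = allows continuous aspect.
CLASSES = {
    "activity":       dict(telic=False, durative=True,  progressive=True),
    "accomplishment": dict(telic=True,  durative=True,  progressive=True),
    "achievement":    dict(telic=True,  durative=False, progressive=False),
    "state":          dict(telic=False, durative=True,  progressive=False),
}

def classify(telic, durative, progressive):
    """Look up the Vendler class matching a feature bundle."""
    bundle = dict(telic=telic, durative=durative, progressive=progressive)
    for name, feats in CLASSES.items():
        if feats == bundle:
            return name
    return None

print(classify(telic=True, durative=True, progressive=True))    # accomplishment ("build a house")
print(classify(telic=True, durative=False, progressive=False))  # achievement ("find")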
The base of the grammar consists so far of branching rules and two types of subcategorization rules: strict subcategorization rules and selection rules. Up to this point, the phrase structure rules have been able to operate blindly. They are defined, therefore, as context-free rules.

Feature Matrixes

The lexicon, then, is a kind of dictionary which lists by category all of the lexical words of a particular language. Each word will be accompanied by a dictionary definition plus two feature matrixes. One feature matrix will list phonological pronunciation features, and the second will list the semantic and syntactic features inherent in the basic
meaning of a word. Some of these semantic features are cross-classifications, but others are hierarchically ordered. If, for instance, we were to list the semantic features for the word man, we could omit [+animate], since [+human] implies [+animate].

The
typical feature matrix for an adjective would include, in addition to the general category feature [+adj], any other inherent semantic or syntactic distinctive features which a particular lexical adjective automatically imposes on other related words in a string. The adjective pregnant, for example, would have to be accompanied by the feature notations [+animate], [+adult], [-masc]. An adverb must have a feature matrix specifying such things as [+manner], [+time], [+place], [+direction], [+condition], and so on.

A pronoun, in addition to
the general category specifications [+noun] and [+pro], must specify whether the word is a personal pronoun [+person], a relative pronoun [+Rel], or a demonstrative pronoun [+dem]. A personal pronoun must be further characterized as first, second, or third person ([+I], [+II], [+III] respectively), as singular or plural ([+plural] or [-plural]), and as being in the nominative [+nom], accusative (objective) [+accus], or possessive [+poss] case.

A relative pronoun (who, which, that) must include the feature [+animate] for who, [-animate] for which, and nothing for that, which can be assumed, in the absence of a specific feature restriction, to be acceptable in either situation. The relative who must also carry the specification for case: [+nom], [+accus], or [+poss]; and the relative which must carry, in addition to [-animate], the further feature specification [-human]. A demonstrative pronoun (this, these, that, those) must have not only the feature specification [+pro] but also [+plural] or [-plural] and [+near] or [-near].

A determiner must have the feature [+common] or [-common] (proper nouns are not preceded by a determiner), and if [+common] the additional specification [+def] or [-def] (a common noun can be either definite: the boy, or indefinite: a boy, an apple). If it is [+def] then it must be further specified as [+dem] or [-dem], and if [+dem] it must be characterized as [+plural] or [-plural] and [+near] or [-near].
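Feature matrixes of this kind are naturally modeled as small attribute maps. The Python sketch below uses our own encoding, with toy entries and feature names taken from the preceding paragraphs, to show how a selectional rule might consult them: pregnant demands a [+animate], [+adult], [-masc] subject noun.

# Sample feature matrixes, following the notation of the text.
NOUNS = {
    "woman": {"animate": True,  "adult": True,  "masc": False},
    "boy":   {"animate": True,  "adult": False, "masc": True},
    "house": {"animate": False},
}

# Selectional features an adjective imposes on its subject noun.
ADJECTIVES = {
    "pregnant": {"animate": True, "adult": True, "masc": False},
}

def selection_ok(noun, adjective):
    """Check that the noun's features satisfy the adjective's demands."""
    need = ADJECTIVES[adjective]
    have = NOUNS[noun]
    return all(have.get(f) == v for f, v in need.items())

print(selection_ok("woman", "pregnant"))  # True
print(selection_ok("boy", "pregnant"))    # False: violates [+adult], [-masc]
print(selection_ok("house", "pregnant"))  # False: violates [+animate]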
The lexicon is similar to, but infinitely more detailed than, a large dictionary. Below are some sample feature matrixes such as might appear in an English lexicon, and following the feature matrixes you will find a set of phrase structure rules as they were revised for the Aspects grammar model.

Lexicon

The lexicon is the second base component of the Aspects model. The entries below include only semantic-syntactic feature matrixes and are therefore incomplete. Phonological feature matrixes are omitted because of space limitations. The reader should nevertheless be aware that phonology has progressed to the stage where phonological feature matrixes are in fact more detailed and accurate than the semantic-syntactic matrixes included here. A short definition would also be included in a complete lexical entry. The sample entries are grouped by word class:
Nouns
  Chicago  [+N, -Common, +Concrete, -Count, -Animate, -Plural]
  doctor   [+N, +Common, +Concrete, +Count, +Human, +Adult]
  dog      [+N, +Common, +Concrete, +Count, +Animate, -Human]
  Jane     [+N, -Common, +Concrete, +Count, +Human, -Masc, -Plural]
  man      [+N, +Common, +Concrete, +Count, +Human, +Adult, +Masc]
  milk     [+N, +Common, +Concrete, -Count, -Animate, +Fluid]
  slacks   [+N, +Common, +Concrete, +Count, -Animate, +Plural]

Verbs
  admire   [+V, +___NP, [+Human]___, -Obj Del]
  defy     [+V, +___NP, [+Animate]___, -Obj Del]
  expect   [+V, +___that-S]
  frighten [+V, +___NP, +___[+Animate]]

Modals
  may    [+M, +___...]
  will   [+M, +___...]
  would  [+M, +___...]

Adjectives
  honest   [+Adj, [+Human]___]
  pregnant [+Adj, [-Masc]___]

Pronouns
  he  [+N, +Pro, +III, +Masc, -Plural, -Accusative]
  I   [+N, +Pro, +I, -Plural, -Accusative]
  it  [+N, +Pro, +III, -Masc, -Fem, -Accusative]
  we  [+N, +Pro, +I, +Plural, -Accusative]

Determiners
  an    [+Det, -Pro, -Def, -Plural, +___[+Common], +___[+Vowel]]
  the   [+Det, -Pro, +Def, -Dem, +___[+Common]]
  that  [+Det, -Pro, +Def, +Dem, -Plural, -Near, +___[+Common]]
  those [+Det, -Pro, +Def, +Dem, +Plural, -Near, +___[+Common]]
Aspects Phrase Structure Rules

S → (SM) NP + VP
S → NP + (SM) S + VP
S → (SM) S1 + and + (SM) S2
SM → (NEG) (Q) (PAS)
NP → N
NP → NP1 + and + NP2
NP → NP + S
VP → Cop + Pred
VP → V (NP)
VP → VP1 + and + VP2
Aux → T (M) (Perfect) (Progressive)
Pred → {NP, Adj, Place}
T → {Past, Present}
Lex → {N, V, Cop, Place, M, Adj}
Preliminary Lexical Insertion

Since the publication of Aspects, linguists have worked out a system which omits mention of verb tense endings, determiners, and other such lexical formatives from the tree structure at this stage in the sentence generation process. They contend that verb and noun segments of this sort are syntactically inherent in the lexical word itself, and furthermore, that this approach makes the grammar theory more abstract and more reflective of universal linguistic requirements. Let us suppose that we are in the process of generating a sentence: we have run through all of the branching phrase structure rules, and we now have a phrase marker whose bottom line contains nothing but dummy symbols and a few formatives.

Those men who lie must hate the truth.

[Tree diagram omitted.]
The next step in the derivation will be to replace each of these dummy symbols with a feature matrix. Moreover, the pattern of the branching indicates that the embedded clause has to be a restrictive relative clause, because its source is an embedded S node which branches from an NP node rather than from an S node. The constituent structure of the string also requires that the verb in the embedded sentence be an intransitive verb (no complement follows it) but that the main clause verb be transitive (an NP follows it). We have no way of knowing at this time what the inherent semantic or syntactic properties of the direct object noun must be.

The next step in our derivation is to replace each dummy symbol with all of the feature specifications which are required by the subcategorization phrase structure rules. The context-free branching rules could apply blindly; to determine feature specifications, however, it has now become necessary to look around, so to speak, to discover the contextual restrictions imposed by this particular string. The tree structure below substitutes a complex symbol for each of these dummy symbols:

[Tree diagram omitted.]
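This substitution step can be sketched in Python. The fragment below is illustrative only (the tree is flattened to two verb nodes, and only the strict subcategorization feature is computed): each dummy V symbol is replaced by a complex symbol, a bundle of features read off the node's context.

# Two dummy V nodes from the phrase marker above, with the contextual
# facts the tree makes available for each node.
nodes = [
    {"clause": "embedded", "followed_by_NP": False},  # the verb of 'who lie'
    {"clause": "main",     "followed_by_NP": True},   # the verb before 'the truth'
]

def complex_symbol(node):
    """Replace a dummy V node with a bundle of contextual features."""
    features = ["+V"]
    # Strict subcategorization: transitive iff an NP follows in the string.
    features.append("+___NP" if node["followed_by_NP"] else "+___#")
    return features

for n in nodes:
    print(n["clause"], complex_symbol(n))
# embedded ['+V', '+___#']   (intransitive: no complement follows)
# main ['+V', '+___NP']      (transitive: an NP follows)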
CHAPTER III
CONCLUSION

A. Revisions of the Syntactic Structures Model

Although the TG grammar model was balked at when it first appeared, step by step it took hold. The thing which made some linguists unhappy with the early form of transformational grammar was its failure to deal satisfactorily with the problem of meaning and meaning relationships. Many linguists came to feel that meaning is basic to language competence, and that therefore the grammar theory ought somehow to incorporate semantic considerations in the phrase-structure part of the grammar.

The idea of making revisions to the early TG model was appealing, for what it would mean is that all the abstract material contained in the deep structure (the base) of the grammar would represent linguistic universals. Only the transformation operations would give instructions for the idiosyncratic forms of particular languages. Another advantage of the revised model is that the linguistic property of recursiveness, which the Syntactic Structures model had for the most part assigned to the transformation rules, would now be completely accounted for in the base.

B. Standard TG Grammar Theory: The Aspects Model

The Aspects TG grammar model has three major components: a syntax, a semantics, and a phonology. Of these three, syntax is central.

C. Aspects Model: The Base Component, I

The base of the grammar consists so far of branching rules and two types of subcategorization rules: strict subcategorization rules and selection rules. Branching rules are context-free rules which act blindly to produce any one of a number of strings terminating in category nodes. Strict subcategorization rules define the syntactic requirements to be met by each constituent in a given string, and selection rules define the semantic requirements imposed on these constituents. Up to this point, the phrase structure rules have been able to operate blindly. They are defined, therefore, as context-free rules.

However, since the publication of Aspects linguists have worked out a system which omits mention of verb endings, determiners, and other such lexical formatives from the tree structure at this stage in the sentence generation process. They contend that verb and noun segments of this sort are syntactically inherent in the lexical word itself, and furthermore that this approach makes the grammar theory more abstract and more reflective of universal linguistic requirements.

Thus, we are already involved with context-sensitive rules, as we will be from now on. As soon as we have made our first lexical choice, that choice will automatically impose further restrictions on all the remaining lexical selections. It is for this reason that we must now turn to the transformational rules of the grammar.
REFERENCES

Binnick, R. I. (1991). Time and the Verb: A Guide to Tense and Aspect. Oxford: Oxford University Press.

Chomsky, Noam. (1965). Aspects of the Theory of Syntax. Cambridge: The M.I.T. Press.

Chomsky, Noam. (2002). Syntactic Structures. Berlin: Mouton de Gruyter.

LaPalombara, Lyda E. (1976). An Introduction to Grammar: Traditional, Structural, Transformational. Cambridge: Winthrop Publishers.

http://en.wikipedia.org/wiki/Lexical_aspect