
Beyond Grammar

by Rens Bod
  • ISBN13: 9781575861500
  • ISBN10: 157586150X
  • Format: Paperback
  • Copyright: 1998-06-01
  • Publisher: Stanford Univ Center for the Study of Language and Information
List Price: $25.00

Summary

During the last few years, a new approach to linguistic analysis has started to emerge. This approach, which has come to be known under various labels such as 'data-oriented parsing', 'corpus-based interpretation' and 'treebank grammar', assumes that human language comprehension and production work with representations of concrete past language experiences rather than with abstract grammatical rules. It operates by decomposing the given representations into fragments and recomposing those pieces to analyze (infinitely many) new utterances. This book shows how this general approach can be applied to various kinds of linguistic representations. Experiments with this approach suggest that the productive units of natural language cannot be defined by a minimal set of rules or principles, but need to be defined by a large, redundant set of previously experienced structures. Bod argues that this outcome has important consequences for linguistic theory, leading to an entirely new view of the nature of linguistic competence.
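
The fragment-and-recomposition idea described above can be made concrete with a small sketch. The toy Python snippet below is an illustration only, not code from the book: the tiny treebank, the function names, and the single hand-picked derivation are all hypothetical. It extracts every fragment from two example parse trees, allowing any nonterminal child to be cut into an open substitution site, and then scores one derivation of a new sentence by the relative frequencies of the fragments it combines, in the spirit of the DOP models the book discusses.

    # Illustrative sketch only: a toy, DOP-style fragment model. The treebank,
    # names, and the hand-picked derivation are hypothetical, not from the book.
    from collections import Counter
    from itertools import product

    # A tree is a tuple (label, child, ...); a child is another tuple or a word.
    # An open substitution site is written as a bare one-element tuple (label,).
    TREEBANK = [
        ("S", ("NP", "she"), ("VP", ("V", "saw"), ("NP", "it"))),
        ("S", ("NP", "he"), ("VP", ("V", "heard"), ("NP", "it"))),
    ]

    def fragments_rooted_at(node):
        """All fragments whose root is this node (depth >= 1)."""
        label, *children = node
        options = []
        for child in children:
            if isinstance(child, str):                # terminal word: must be kept
                options.append([child])
            else:                                     # nonterminal child: either cut it
                options.append([(child[0],)]          # into an open substitution site,
                               + list(fragments_rooted_at(child)))  # or keep a fragment
        for combo in product(*options):
            yield (label, *combo)

    def all_nodes(tree):
        """Every internal (nonterminal) node of a tree."""
        yield tree
        for child in tree[1:]:
            if isinstance(child, tuple):
                yield from all_nodes(child)

    # Collect every fragment of every node of every treebank tree.
    counts = Counter(frag for tree in TREEBANK
                     for node in all_nodes(tree)
                     for frag in fragments_rooted_at(node))
    root_totals = Counter()
    for frag, c in counts.items():
        root_totals[frag[0]] += c

    def fragment_prob(frag):
        """Relative frequency of a fragment among fragments with the same root."""
        return counts[frag] / root_totals[frag[0]]

    # One derivation of "she heard it": a sentence fragment with open NP and V
    # sites, filled in (leftmost substitution) by two lexical fragments.
    derivation = [
        ("S", ("NP",), ("VP", ("V",), ("NP", "it"))),
        ("NP", "she"),
        ("V", "heard"),
    ]
    p = 1.0
    for frag in derivation:
        p *= fragment_prob(frag)
    print(f"derivation probability: {p:.6f}")  # product of fragment probabilities

A sentence generally has many such derivations, and a parse's probability is the sum over all derivations that yield it, which is why the book turns to Monte Carlo sampling rather than Viterbi optimization for disambiguation (see the table of contents below).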

Table of Contents

Preface xi
Introduction: what are the productive units of natural language? 1(11)
A probabilistic approach to language 2(2)
Stochastic grammars and the problem of productive unit size 4(1)
The Data-Oriented Parsing framework: productivity from examples 5(2)
Language comprehension vs. language production 7(2)
Evaluation of DOP models 9(1)
Overview of this book 10(2)
An experience-based model for phrase-structure representations 12(12)
Representations 12(1)
Fragments 13(1)
Composition operations 14(2)
Probability calculation 16(8)
Formal Stochastic Language Theory 24(16)
A formal language theory of stochastic grammars 24(2)
DOP1 as a Stochastic Tree-Substitution Grammar 26(1)
A comparison between Stochastic Tree-Substitution Grammar and Stochastic Context-Free Grammar 27(6)
Other stochastic grammars 33(5)
Stochastic History-Based Grammar (SHBG) 33(3)
Stochastic Lexicalized Tree-Adjoining Grammar (SLTAG) 36(1)
Other stochastic lexicalized grammars 37(1)
Open questions 38(2)
Parsing and disambiguation 40(11)
Parsing 40(3)
Disambiguation 43(8)
Viterbi optimization is not applicable to finding the most probable parse 43(2)
Monte Carlo disambiguation: estimating the most probable parse by sampling random derivations 45(4)
Cognitive aspects of Monte Carlo disambiguation 49(2)
Testing the model: can we restrict the productive units? 51(18)
The test environment 51(2)
The base line 53(1)
The impact of overlapping fragments 53(1)
The impact of fragment size 54(1)
The impact of fragment lexicalization 55(2)
The impact of fragment frequency 57(1)
The impact of non-head words 58(6)
Overview of the derived properties and discussion 64(5)
Learning new words 69(12)
The model DOP2 69(1)
Experiments with DOP2 70(2)
Evaluation: what goes wrong? 72(7)
The problem of unknown-category words 79(2)
Learning new structures 81(14)
The problem of unknown structures 81(3)
Good-Turing: estimating the population frequencies of (un)seen types 84(1)
Using Good-Turing to adjust the frequencies of subtrees 85(3)
The model DOP3 88(1)
Cognitive aspects of DOP3 88(2)
Experiments with DOP3 90(5)
An experience-based model for compositional semantic representations 95(17)
Incorporating semantic interpretation 95(14)
Assuming surface compositionality 96(8)
Not assuming surface compositionality: partial annotations 104(3)
The probability model of semantic DOP 107(2)
Extending DOP to discourse and recency 109(3)
Speech understanding and dialogue processing 112(14)
The OVIS corpus: trees enriched with compositional frame semantics 112(4)
Using the OVIS corpus for data-oriented semantic analysis 116(2)
Extending DOP to dialogue context: context-dependent subcorpora 118(1)
Interfacing DOP with speech 119(3)
Experiments 122(4)
Experience-based models for non-context-free representations 126(18)
A DOP model for Lexical-Functional representations 126(12)
Representations 126(2)
Fragments 128(3)
The composition operation 131(3)
Probability models 134(4)
Illustration and properties of LFG-DOP 138(6)
Conclusion: linguistics revisited 144(2)
References 146(17)
Index 163

Supplemental Materials

What is included with this book?

A new copy of this book will include any supplemental materials advertised. Please check the title of the book to determine whether it should include any access cards, study guides, lab manuals, CDs, etc.

Used, rental, and eBook copies of this book are not guaranteed to include any supplemental materials. Typically, only the book itself is included. This is true even if the title states that it includes access cards, study guides, lab manuals, CDs, etc.
