edited by Stephan Oepen, Dan Flickinger, Jun-ichi Tsujii and Hans Uszkoreit
CSLI, 2001
Cloth: 978-1-57586-289-7 | eISBN: 978-1-57586-918-6 | Paper: 978-1-57586-290-3
Library of Congress Classification P98.C547 2002
Dewey Decimal Classification 410.285

ABOUT THIS BOOK
Following high hopes and subsequent disillusionment in the late 1980s, the past decade of work in language engineering has seen a dramatic increase in the power and sophistication of statistical approaches to natural language processing, along with a growing recognition that these methods alone cannot meet the full range of demands for applications of NLP. While statistical methods, often described as 'shallow' processing techniques, can bring real advantages in robustness and efficiency, they do not provide the precise, reliable representations of meaning which more conventional symbolic grammars can supply for natural language. A consistent, fine-grained mapping between form and meaning is of critical importance in some NLP applications, including machine translation, speech prosthesis, and automated email response. Recent advances in grammar development and processing implementations offer hope of meeting these demands for precision.

This volume provides an update on the state of the art in the development and application of broad-coverage declarative grammars built on sound linguistic foundations (the 'deep' processing paradigm), and presents several aspects of an international research effort to produce comprehensive, re-usable grammars and efficient technology for parsing and generating with such grammars.
