front cover of Algebras, Diagrams and Decisions in Language, Logic and Computation
Algebras, Diagrams and Decisions in Language, Logic and Computation
Edited by Kees Vermeulen and Ann Copestake
CSLI, 2002
This exemplary volume shows how the shared interests of three different research areas can lead to significant and fruitful exchanges: six papers, each very accessible, present exciting contributions to the study and uses of algebras, diagrams, and decisions, ranging from indispensable overviews of shared formal methods to inspiring applications of formal tools to specific problems. Contributors include Pieter Adriaans, Sergei Artemov, Steven Givant, Edward Keenan, Almerindo Ojeda, Patrick Scotto di Luzio, and Edward Stabler.
[more]

front cover of Ambiguity in Language Learning
Ambiguity in Language Learning
Computational and Cognitive Models
Hinrich Schütze
CSLI, 1997
This volume is concerned with how ambiguity and ambiguity resolution are learned, that is, with the acquisition of the different representations of ambiguous linguistic forms and the knowledge necessary for selecting among them in context. Schütze concentrates on how the acquisition of ambiguity is possible in principle and demonstrates that particular types of algorithms and learning architectures (such as unsupervised clustering and neural networks) can succeed at the task. Three types of lexical ambiguity are treated: ambiguity in syntactic categorisation, semantic categorisation, and verbal subcategorisation. The volume presents three different models of ambiguity acquisition: Tag Space, Word Space, and Subcat Learner, and addresses the importance of ambiguity in linguistic representation and its relevance for linguistic innateness.
[more]

front cover of Arabic Computational Linguistics
Arabic Computational Linguistics
Edited by Ali Farghaly
CSLI, 2010
Arabic is an exciting—yet challenging—language for scholars because many of its linguistic properties have not been fully described. Arabic Computational Linguistics documents the recent work of researchers in both academia and industry who have taken up the challenge of solving the real-life problems posed by an understudied language.

This comprehensive volume explores new Arabic machine translation systems, innovations in speech recognition and mention detection, tree banks, and linguistic corpora. Arabic Computational Linguistics will be an indispensable reference for language researchers and practitioners alike.
[more]

front cover of Attribute-Value Logic and the Theory of Grammar
Attribute-Value Logic and the Theory of Grammar
Mark Johnson
CSLI, 1988
Because of the ease of their implementation, attribute-value based theories of grammar are becoming increasingly popular in theoretical linguistics as an alternative to transformational accounts, as well as in computational linguistics. Mark Johnson provides a formal analysis of attribute-value structures, of their use in a theory of grammar, of the representation of grammatical relations in such theories of grammar, and of the implications of different representations. A classical treatment of disjunction and negation is also included.

"Essential reading for anyone interested in recent unification-based approaches to grammar. Johnson lucidly lays out a formal framework in which a sharp distinction is drawn between descriptions of linguistic objects and the objects themselves. Negation and disjunction over complex features, though linguistically desirable, have given rise to many problems, and one of Johnson's main achievements is to show that they can be interpreted using classical logic." -Ewan Klein, University of Edinburgh

Mark Johnson is assistant professor of cognitive and linguistic sciences at Brown University.
[more]

front cover of Automaton Theories of Human Sentence Comprehension
Automaton Theories of Human Sentence Comprehension
John T. Hale
CSLI, 2014
By relating grammar to cognitive architecture, John T. Hale shows step-by-step how incremental parsing works in models of perceptual processing and how specific learning rules might lead to frequency-sensitive preferences. Along the way, Hale reconsiders garden-pathing, the parallel/serial distinction, and information-theoretical complexity metrics, such as surprisal. This book is a must for cognitive scientists of language.
[more]

front cover of Beyond Grammar
Beyond Grammar
An Experience-Based Theory of Language
Rens Bod
CSLI, 1998
During the last few years, a new approach to language processing has started to emerge, which has become known under the name of "Data-Oriented Parsing" or "DOP". This approach embodies the assumption that human language comprehension and production work with representations of concrete past language experiences, rather than with abstract grammatical rules. The models that instantiate this approach therefore maintain corpora of linguistic representations of previously occurring utterances. New utterance-representations are constructed by freely combining partial structures from the corpus. A probability model is used to choose, from the collection of different structures of different sizes, those that make up the most appropriate representation of an utterance.

In this book, DOP models for several kinds of linguistic representations are developed, including tree representations, compositional semantic representations, attribute-value representations, and dialogue representations. These models are studied from a formal, linguistic, and computational perspective and are tested with available language corpora.

The main outcome of these tests suggests that the productive units of natural language cannot be defined in terms of a minimal set of rules (or constraints or principles), as is usually attempted in linguistic theory, but need to be defined in terms of a large, redundant set of previously experienced structures with virtually no restriction on their size and complexity. I will argue that this outcome has important consequences for linguistic theory, leading to a new notion of language competence. In particular, it means that the knowledge of a speaker/hearer cannot be understood as a grammar, but as a statistical ensemble of language experiences that changes slightly every time a new utterance is processed.
[more]

front cover of Collaborative Language Engineering
Collaborative Language Engineering
A Case Study in Efficient Grammar-Based Processing
Edited by Stephan Oepen, Dan Flickinger, Jun-ichi Tsujii, and Hans Uszkoreit
CSLI, 2001
Following high hopes and subsequent disillusionment in the late 1980s, the past decade of work in language engineering has seen a dramatic increase in the power and sophistication of statistical approaches to natural language processing, along with a growing recognition that these methods alone cannot meet the full range of demands for applications of NLP. While statistical methods, often described as 'shallow' processing techniques, can bring real advantages in robustness and efficiency, they do not provide the precise, reliable representations of meaning which more conventional symbolic grammars can supply for natural language. A consistent, fine-grained mapping between form and meaning is of critical importance in some NLP applications, including machine translation, speech prosthesis, and automated email response. Recent advances in grammar development and processing implementations offer hope of meeting these demands for precision.

This volume provides an update on the state of the art in the development and application of broad-coverage declarative grammars built on sound linguistic foundations - the 'deep' processing paradigm - and presents several aspects of an international research effort to produce comprehensive, re-usable grammars and efficient technology for parsing and generating with such grammars.
[more]

front cover of Collected Papers of Martin Kay
Collected Papers of Martin Kay
A Half Century of Computational Linguistics
Martin Kay, with the editorial assistance of Dan Flickinger and Stephan Oepen
CSLI, 2010
Since the dawn of the age of computers, researchers have been pushing the limits of available processing power to tackle the formidable challenge of developing software that can understand ordinary human language.  At the forefront of this quest for the past fifty years, Martin Kay has been a constant source of new algorithms which have proven fundamental to progress in computational linguistics. Collected Papers of Martin Kay, the first comprehensive collection of his works to date, opens a window into the growth of an increasingly important field of scientific research and development.
[more]

front cover of Composition and Big Data
Composition and Big Data
Amanda Licastro, Benjamin Miller
University of Pittsburgh Press, 2021

In a data-driven world, anything can be data. As the techniques and scale of data analysis advance, the need for a response from rhetoric and composition grows ever more pronounced. It is increasingly possible to examine thousands of documents and peer-review comments, labor-hours, and citation networks in composition courses and beyond. Composition and Big Data brings together a range of scholars, teachers, and administrators already working with big-data methods and datasets to kickstart a collective reckoning with the role that algorithmic and computational approaches can, or should, play in research and teaching in the field. Their work takes place in various contexts, including programmatic assessment, first-year pedagogy, stylistics, and learning transfer across the curriculum. From ethical reflections to database design, from corpus linguistics to quantitative autoethnography, these chapters implement and interpret the drive toward data in diverse ways. 

[more]

front cover of A Computational Introduction to Linguistics
A Computational Introduction to Linguistics
Describing Language in Plain Prolog
Almerindo E. Ojeda
CSLI, 2014
In this book, Almerindo E. Ojeda offers a unique perspective on linguistics, developing computer programs that assign particular sounds to particular meanings and, conversely, particular meanings to particular sounds. Since these assignments are to operate efficiently over unbounded domains of sound and sense, they can begin to model the two fundamental modalities of human language—speaking and hearing. The computational approach adopted in this book is motivated by our struggle with one of the key problems of contemporary linguistics—figuring out how it is that language emerges from the brain.

[more]

front cover of Data-Oriented Parsing
Data-Oriented Parsing
Edited by Rens Bod, Remko Scha, and Khalil Sima'an
CSLI, 2003
Data-Oriented Parsing (DOP) is one of the leading paradigms in Statistical Natural Language Processing. In this volume, a collection of computational linguists offer a state-of-the-art overview of DOP, suitable for students and researchers in natural language processing and speech recognition as well as for computational linguistics.

This handbook begins with the theoretical background of DOP and introduces the algorithms used in DOP as well as in other probabilistic grammar models. After surveying extensions to the basic DOP model, the volume concludes with close study of the applications that use DOP as a backbone: speech understanding, machine translation, and language learning.
[more]

front cover of Finite-State Morphology
Finite-State Morphology
Kenneth R. Beesley and Lauri Karttunen
CSLI, 2003
The finite-state paradigm of computer science has provided a basis for natural-language applications that are efficient, elegant, and robust. This volume is a practical guide to finite-state theory and the affiliated programming languages lexc and xfst. Readers will learn how to write tokenizers, spelling checkers, and especially morphological analyzer/generators for words in English, French, Finnish, Hungarian, and other languages.

Included are graded introductions, examples, and exercises suitable for individual study as well as formal courses. These take advantage of widely-tested lexc and xfst applications that are just becoming available for noncommercial use via the Internet.
[more]

front cover of Flexible Semantics for Reinterpretation Phenomena
Flexible Semantics for Reinterpretation Phenomena
Markus Egg
CSLI, 2005
Deriving the correct meaning of such colloquial expressions as "I am parked out back" requires a unique interaction of knowledge about the world with a person's natural language tools. In this volume, Markus Egg examines how natural language rules and knowledge of the world work together to produce correct understandings of expressions that cannot be fully understood through literal reading. An in-depth and exciting work on semantics and natural language, this volume will be essential reading for scholars in computational linguistics.
[more]

front cover of A Grammar Writer's Cookbook
A Grammar Writer's Cookbook
Edited by Miriam Butt, Tracy Holloway King, María-Eugenia Niño, and Frédérique S
CSLI, 1999
A Grammar Writer's Cookbook is an introduction to the issues involved in the writing and design of computational grammars, reporting on experiences and analyses within the ParGram parallel grammar development project. Using the Lexical Functional Grammar (LFG) framework, this project implemented grammars for German, French, and English to cover parallel corpora.
[more]

logo for University of Chicago Press
Grammatical Competence and Parsing Performance
Bradley L. Pritchett
University of Chicago Press, 1992
How does a parser, a device that imposes an analysis on a string of symbols so that they can be interpreted, work? More specifically, how does the parser in the human cognitive mechanism operate? Using a wide range of empirical data concerning human natural language processing, Bradley Pritchett demonstrates that parsing performance depends on grammatical competence, not, as many have thought, on perception, computation, or semantics.

Pritchett critiques the major performance-based parsing models to argue that the principles of grammar drive the parser; the parser, furthermore, is the apparatus that tries to enforce the conditions of the grammar at every point in the processing of a sentence. In comparing garden path phenomena, those instances when the parser fails on the first reading of a sentence and must reanalyze it, with occasions when the parser successfully functions the first time around, Pritchett makes a convincing case for a grammar-derived parsing theory.
[more]

front cover of Grammatical Framework
Grammatical Framework
Programming with Multilingual Grammars
Aarne Ranta
CSLI, 2011

Grammatical Framework is a programming language designed for writing grammars, which has the capability of addressing several languages in parallel. This thorough introduction demonstrates how to write grammars in Grammatical Framework and use them in applications such as tourist phrasebooks, spoken dialogue systems, and natural language interfaces. The examples and exercises presented here address several languages, and the readers are shown how to look at their own languages from the computational perspective.

[more]

front cover of Handbook for Language Engineers
Handbook for Language Engineers
Edited by Ali Farghaly
CSLI, 2002
There is an overwhelming amount of language data on the Internet that needs to be searched, categorized, or processed—making the role of linguistics in the design of information systems a critical one. This book is a guide for linguists hoping to enter the language-processing field, as it assembles distinguished computational linguists from academia, research centers, and business to discuss how linguists can solve practical problems and improve business efficiency. Covering topics from speech recognition to web language resources, this collection will be of great value to both linguists entering the field and businesses hoping to implement linguistics-based solutions.
[more]

front cover of Implementing Typed Feature Structure Grammars
Implementing Typed Feature Structure Grammars
Ann Copestake
CSLI, 2001
Much of the work in modern formal linguistics is concerned with creating mathematically precise accounts of human languages—accounts that are particularly useful in research involving language processing with computers. Implementing Typed Feature Structure Grammars provides a student-level introduction to the most popular approach to this issue, and includes software that allows users to experiment with modeling different aspects of language.
[more]

front cover of Information Sharing
Information Sharing
Reference and Presupposition in Language Generation and Interpretation
Edited by Kees van Deemter and Rodger Kibble
CSLI, 2002
This book introduces the concept of information sharing as an area of cognitive science, defining it as the process by which speakers depend on "given" information to convey "new" information—an idea crucial to language engineering. Where previous work in information sharing was often fragmented between different disciplines, this volume brings together theoretical and applied work, and joins computational contributions with papers based on analyses of language corpora and on psycholinguistic experimentation.
[more]

front cover of Intelligent Linguistic Architectures
Intelligent Linguistic Architectures
Variations on Themes by Ronald M. Kaplan
Edited by Miriam Butt, Mary Dalrymple, and Tracy Holloway King
CSLI, 2006

Ronald M. Kaplan has made foundational contributions to the development of computational linguistic research and linguistic theory, particularly within Lexical-Functional Grammar. Intelligent Linguistic Architectures, a tribute to Kaplan’s cutting-edge work, collects computational and theoretical linguistics papers in his research areas. From machine translation to grammar engineering, from formal issues to semantic theory, this ambitious volume represents the newest developments in linguistic scholarship.

[more]

front cover of Jacy
Jacy
An Implemented Grammar of Japanese
Melanie Siegel, Emily M. Bender, and Francis Bond
CSLI, 2017
This book describes the fundamentals of Jacy, an implementation of a Japanese head-driven phrase structure grammar with many useful linguistic implications. Jacy presents sound information about the Japanese language (syntax, semantics, and pragmatics) based on implementation and tested on large quantities of data. As the grammar development was done in a multilingual environment, Jacy also showcases both multilingual concepts and differences among the languages and demonstrates the usefulness of semantic analysis in language technology applications.
[more]

front cover of Language and Grammar
Language and Grammar
Studies in Mathematical Linguistics and Natural Language
Edited by Claudia Casadio, Philip Scott, and Robert Seely
CSLI, 2004
The application of logic to grammar is a fundamental issue in philosophy and has been investigated by such renowned philosophers as Leibniz, Bolzano, Frege, and Husserl. Language and Grammar examines categorial grammars and type-logical grammars, two linguistic theories that play a significant role in this area of study yet have been overshadowed until recently. The prominent scholars contributing to this volume also explore the impact of the Lambek program on linguistics and logical grammar, producing, ultimately, an exciting and important resource that demonstrates how type-logical grammars are promising future models of reasoning and computation.
[more]

front cover of Linguistic Databases
Linguistic Databases
Edited by John Nerbonne
CSLI, 1998
Linguistic Databases explains the increasing use of databases in linguistics. Specifically, the works presented in this collection report on database activities in phonetics, phonology, lexicography and syntax, comparative grammar, second-language acquisition, linguistic fieldwork, and language pathology. The volume presents the specialized problems of multi-media (especially audio) and multilingual texts, including those in exotic writing systems. Various implemented solutions are discussed, and the opportunities to use existing, minimally structured text repositories are presented.
[more]

front cover of Linguistic Form and Its Computation
Linguistic Form and Its Computation
Edited by Christian Rohrer, Antje Rossdeutscher, and Hans Kamp
CSLI, 2001
This volume presents results of the Collaborative Research Center "Linguistic Foundations for Computational Linguistics" at the Universities of Stuttgart and Tübingen, whose goal has been to foster interaction between theoretical and computational linguistics. The papers here address topics including syntax, the syntax-semantics interface, the syntax-pragmatics interface, discourse, methods for lexicon induction, and the challenges of ambiguity.
[more]

front cover of Linguistics and Computation
Linguistics and Computation
Edited by Jennifer S. Cole, Georgia M. Green, and Jerry L. Morgan
CSLI, 1995
This volume is a collection covering the diverse areas of psycholinguistics, syntax, computational linguistics, and phonology. Abney's paper on Chunks provides an interesting new approach to phrase structure, motivated by psycholinguistic data, something that is rarely done. Berwick and Fong provide a history of computational implementations of (Chomskyan) Transformational Grammar. Cole's phonology paper, arguing from Chamorro and English stress that cyclicity is not needed in phonology, is preceded by a one-and-a-half-page introduction on why this is relevant to computation. Coleman's contribution summarises work on computational phonology and describes the YorkTalk speech synthesis system. Hirschberg and Sproat's paper describes a system they have written to assign pitch accent to unrestricted text in an AT&T text-to-speech system; this is very much applied natural language processing, but their system represents a more thoroughgoing attempt at doing this well than has previously been made, and this appears to be the first write-up of the work. Johnson and Moss introduce Stratified Feature Grammar, a formal model of language inspired by Relational Grammar but formalised by using and extending tools developed in the unification grammar community. Finally, Nakazawa further extends Tomita's work so that LR parsing methods from computer science can be applied to natural language grammars, here feature-based grammars.
[more]

front cover of Logic, Language and Computation, Volume 2
Logic, Language and Computation, Volume 2
Edited by Lawrence S. Moss, Jonathan Ginzburg, and Maarten de Rijke
CSLI, 1999
The fields of logic, linguistics, and computer science are intimately related, and modern research has uncovered a wide range of connections. This collection of nineteen papers focuses on work based on the unifying concept of information, covering subjects such as channel theory, presupposition and constraints, the modeling of discourse, and belief. All of the papers were presented at the 1996 Conference on Information-Theoretic Approaches to Logic, Language, Information, and Computation.
[more]

front cover of Logic, Language and Computation
Logic, Language and Computation
Edited by Jerry Seligman and Dag Westerståhl
CSLI, 1996
Subject: Linguistics; Logic; Computational Linguistics
[more]

front cover of Logic, Language and Computation, Volume 3
Logic, Language and Computation, Volume 3
Edited by Patrick Blackburn, Nick Braisby, Lawrence Cavedon, and Atsushi Shimoji
CSLI, 2001
With the rise of the internet and the proliferation of technology to gather and organize data, our era has been defined as "the information age." With the prominence of information as a research concept, there has arisen an increasing appreciation of the intertwined nature of fields such as logic, linguistics, and computer science that address questions about information and the ways it can be processed. The many research traditions do not agree about the exact nature of information. By bringing together ideas from diverse perspectives, this book presents the emerging consensus about what a conclusive theory of information should be. The book provides an introduction to the topic, work on the underlying ideas, and technical research that pins down the richer notions of information from a mathematical point of view.

The book contains contributions to a general theory of information, while also tackling specific problems from artificial intelligence, formal semantics, cognitive psychology, and the philosophy of mind. There is focus on the dynamics of information flow, and also a consideration of static approaches to information content; both quantitative and qualitative approaches are represented.
[more]

front cover of Measured Language
Measured Language
Quantitative Studies of Acquisition, Assessment, and Variation
Jeffrey Connor-Linton and Luke Wander Amoroso, Editors
Georgetown University Press, 2014

Measured Language: Quantitative Studies of Acquisition, Assessment, and Variation focuses on ways in which various aspects of language can be quantified and how measurement informs and advances our understanding of language. The metaphors and operationalizations of quantification serve as an important lingua franca for seemingly disparate areas of linguistic research, allowing methods and constructs to be translated from one area of linguistic investigation to another.

Measured Language includes forms of measurement and quantitative analysis current in diverse areas of linguistic research from language assessment to language change, from generative linguistics to experimental psycholinguistics, and from longitudinal studies to classroom research. Contributors demonstrate how to operationalize a construct, develop a reliable way to measure it, and finally validate that measurement—and share the relevance of their perspectives and findings to other areas of linguistic inquiry. The range and clarity of the research collected here ensures that even linguists who would not traditionally use quantitative methods will find this volume useful.

[more]

logo for Georgetown University Press
Mind and Context in Adult Second Language Acquisition
Methods, Theory, and Practice
Cristina Sanz, Editor
Georgetown University Press, 2005

How do people learn nonnative languages? Is there one part or function of our brains solely dedicated to language processing, or do we apply our general information-processing abilities when learning a new language? In this book, an interdisciplinary collaboration of scholars and researchers presents an overview of the latter approach to adult second language acquisition and brings together, for the first time, a comprehensive picture of the latest research on this subject.

Clearly organized into four distinct but integrated parts, Mind and Context in Adult Second Language Acquisition first provides an introduction to information-processing approaches and the tools for students to understand the data. The next sections explain factors that affect language learning, both internal (attention and awareness, individual differences, and the neural bases of language acquisition) and external (input, interaction, and pedagogical interventions). It concludes by looking at two pedagogical applications: processing instruction and content-based instruction.

This important and timely volume is a must-read for students of language learning, second language acquisition, and linguists who want to better understand the information-processing approaches to learning a non-primary language. This book will also be of immense interest to language scholars, program directors, teachers, and administrators in both second language acquisition and cognitive psychology.

[more]

front cover of Open-Domain Question Answering from Large Text Collections
Open-Domain Question Answering from Large Text Collections
Marius Pasca
CSLI, 2003
Many books have indexes, but most textual media have none. Newspapers, legal transcripts, conference proceedings, correspondence, video subtitles, and web pages are increasingly accessible with computers, yet are still without indexes or other sophisticated means of finding the excerpts most relevant to a reader's question.

Better than an index, and much better than a keyword search, are the high-precision computerized question-answering systems explored in this book. Marius Pasca presents novel and robust methods for capturing the semantics of natural language questions and for finding the most relevant portions of texts. This research has led to a fully implemented and rigorously evaluated architecture that has produced experimental results showing great promise for the future of internet search technology.
[more]

front cover of Predicative Constructions
Predicative Constructions
From the Fregean to a Montagovian Treatment
Frank Van Eynde
CSLI, 2015
There are multitudes of ways in which predicative constructions can be analyzed. In this book, Frank Van Eynde differentiates between the Fregean and Montagovian treatments of these constructions in order to better understand predicative constructions as a grammatical model. Although he focuses his arguments on English and Dutch, Van Eynde also includes analyses of other Indo-European and non-Indo-European languages in order to better explore phenomena that do not occur in the two primary languages of his study.
[more]

front cover of Probabilistic Approaches to Linguistic Theory
Probabilistic Approaches to Linguistic Theory
Edited by Jean-Philippe Bernardy, Rasmus Blanck, Stergios Chatzikyriakidis, Shalom Lappin, and Aleksandre Maskharashvili
CSLI, 2022
A textbook exploring predictive modes of linguistic development and analysis.

During the last two decades, computational linguists, in concert with other researchers in AI, have turned to machine learning and statistical techniques to capture features of natural language and aspects of the learning process that are not easily accommodated in classical algebraic frameworks. These developments are producing a revolution in linguistics in which traditional symbolic systems are giving way to probabilistic and deep learning approaches. This collection features articles that provide background to these approaches, and their application in syntax, semantics, pragmatics, morphology, psycholinguistics, neurolinguistics, and dialogue modeling. Each chapter provides a self-contained introduction to the topic that it covers, making this volume accessible to graduate students and researchers in linguistics, NLP, AI, and cognitive science.
[more]

front cover of Putting Linguistics into Speech Recognition
Putting Linguistics into Speech Recognition
The Regulus Grammar Compiler
Manny Rayner, Beth Ann Hockey, and Pierrette Bouillon
CSLI, 2006
Most computer programs that analyze spoken dialogue use a spoken command grammar, which limits what the user can say when talking to the system. To make this process simpler, more automated, and effective for command grammars even at initial stages of a project, the Regulus grammar compiler was developed by a consortium of experts—including NASA scientists. This book presents a complete description of both the practical and theoretical aspects of Regulus and will be extremely helpful for students and scholars working in computational linguistics as well as software engineering.
[more]

front cover of Semantic Ambiguity and Underspecification
Semantic Ambiguity and Underspecification
Edited by Kees van Deemter and Stanley Peters
CSLI, 1996
Subject: Linguistics; Semantics; Ambiguity
[more]

front cover of Shaping Phonology
Shaping Phonology
Edited by Diane Brentari and Jackson L. Lee
University of Chicago Press, 2018
Within the past forty years, the field of phonology—a branch of linguistics that explores both the sound structures of spoken language and the analogous phonemes of sign language, as well as how these features of language are used to convey meaning—has undergone several important shifts in theory that are now part of standard practice. Drawing together contributors from a diverse array of subfields within the discipline, and honoring the pioneering work of linguist John Goldsmith, this book reflects on these shifting dynamics and their implications for future phonological work.

Divided into two parts, Shaping Phonology first explores the elaboration of abstract domains (or units of analysis) that fall under the purview of phonology. These chapters reveal the increasing multidimensionality of phonological representation through such analytical approaches as autosegmental phonology and feature geometry. The second part looks at how the advent of machine learning and computational technologies has allowed for the analysis of larger and larger phonological data sets, prompting a shift from using key examples to demonstrate that a particular generalization is universal to striving for statistical generalizations across large corpora of relevant data. Now fundamental components of the phonologist’s tool kit, these two shifts have inspired a rethinking of just what it means to do linguistics.
[more]

front cover of The Structure of Scientific Articles
The Structure of Scientific Articles
Applications to Citation Indexing and Summarization
Simone Teufel
CSLI, 2010
Finding a particular scientific document amidst a sea of thousands of other documents can often seem like an insurmountable task. The Structure of Scientific Articles shows how linguistic theory can provide a solution by analyzing rhetorical structures to make information retrieval easier and faster.
Through the use of an improved citation indexing system, this indispensable volume applies empirical discourse studies to pressing issues of document management, including attribution, the author’s stance towards other work, and problem-solving processes.
[more]

front cover of The Tbilisi Symposium on Logic, Language and Computation
The Tbilisi Symposium on Logic, Language and Computation
Selected Papers
Edited by Jonathan Ginzburg, Zurab Khasidashvili, Carl Vogel, Jean-Jacques Lévy,
CSLI, 1998
This volume brings together papers from linguists, logicians, and computer scientists from thirteen countries (Armenia, Denmark, France, Georgia, Germany, Israel, Italy, Japan, Poland, Spain, Sweden, UK, and USA). This collection aims to serve as a catalyst for new interdisciplinary developments in language, logic and computation and to introduce new ideas from the expanded European academic community. Spanning a wide range of disciplines, the papers cover such topics as formal semantics of natural language, dynamic semantics, channel theory, formal syntax of natural language, formal language theory, corpus-based methods in computational linguistics, computational semantics, syntactic and semantic aspects of λ-calculus, non-classical logics, and a fundamental problem in predicate logic.
[more]

front cover of Where's the Rhetoric?
Where's the Rhetoric?
Imagining a Unified Field
S. Scott Graham
The Ohio State University Press, 2020
The emergence of rhetorical new materialisms and computational rhetorics has provoked something of an existential crisis within rhetorical studies. In Where’s the Rhetoric?, S. Scott Graham tackles this titular question by arguing first that scholarly efforts in rhetorical new materialisms and computational rhetoric be understood as coextensive with longstanding disciplinary commitments in rhetoric. In making this argument, Graham excavates the shared intellectual history of traditional rhetorical inquiry, rhetorical new materialisms, and computational rhetoric with particular emphasis on the works of Carolyn Miller, Kenneth Burke, and Henri Bergson.
 
Building on this foundation, Graham then argues for a more unified approach to contemporary rhetorical inquiry—one that eschews disciplinary demarcations between rhetoric’s various subareas. Specifically, Graham uses his unified field theory to explore 1) the rise of the “tweetorial” as a parascientific genre, 2) inventional practices in new media design, 3) statistical approaches to understanding biomedical discourse, and 4) American electioneering rhetorics. The book overall demonstrates how seemingly disparate intellectual approaches within rhetoric can be made to speak productively to one another in the pursuit of shared scholarly goals around questions of genre, media, and political discourse—thereby providing a foundation for imagining a more unified field.
 
[more]

