Articulating Reasons
An Introduction to Inferentialism
Robert B. Brandom
Harvard University Press, 2001

Robert B. Brandom is one of the most original philosophers of our day, whose book Making It Explicit covered and extended a vast range of topics in metaphysics, epistemology, and philosophy of language--the very core of analytic philosophy. This new work provides an approachable introduction to the complex system that Making It Explicit mapped out. A tour of the earlier book's large ideas and relevant details, Articulating Reasons offers an easy entry into two of the main themes of Brandom's work: the idea that the semantic content of a sentence is determined by the norms governing inferences to and from it, and the idea that the distinctive function of logical vocabulary is to let us make our tacit inferential commitments explicit.

Brandom's work, making the move from representationalism to inferentialism, constitutes a near-Copernican shift in the philosophy of language--and the most important single development in the field in recent decades. Articulating Reasons puts this accomplishment within reach of nonphilosophers who want to understand the state of the foundations of semantics.


Cognitive Harmony
The Role of Systemic Harmony in the Constitution of Knowledge
Nicholas Rescher
University of Pittsburgh Press, 2005

This novel approach to epistemological discourse explains the complex but crucial role that systematization plays--not just for the organization of what we know, but also for its validation. Cognitive Harmony argues for a new conception of the process philosophers generally call induction.

Relying on the root definition of harmony, a coherent unification of component parts (systemic integrity) in such a way that the final object can successfully accomplish what it was meant to do (evaluative positivity), Rescher discusses the role of harmony in cognitive contexts, the history of cognitive harmony, and the various features it has in producing human knowledge. The book ends on the issue of philosophy and the sort of harmony required of philosophical systems.


Culture and Inference
A Trobriand Case Study
Edwin Hutchins
Harvard University Press, 1980

This book takes a major step in psychological anthropology by applying new analytic tools from cognitive science to one of the oldest and most vexing anthropological problems: the nature of "primitive" thought.

For a decade or more there has been broad agreement within anthropology that culture might be usefully viewed as a system of tacit rules that constrain the meaningful interpretation of events and serve as a guide to action. However, no one has made a serious attempt to write a cultural grammar that would make such rules explicit. In Culture and Inference Edwin Hutchins makes just such an attempt for one enormously instructive case, the Trobriand Islanders' system of land tenure.

Using the propositional network notation developed by Rumelhart and Norman, Hutchins describes native knowledge about land tenure as a set of twelve propositions. Inferences are derived from these propositions by a set of transfer formulas that govern the way in which static knowledge about land tenure can be applied to new disputes. After deriving this descriptive system by extensive observation of the Trobrianders' land courts and by interrogation of litigants, Hutchins provides a test of his grammar by showing how it can be used to simulate decisions in new cases.

What is most interesting about these simulations is that they require all the same logical operations that arise from a careful analysis of Western thought. Looking closely at "primitive" inference in a natural situation, Hutchins finds that Trobriand reasoning is no more primitive than our own.


Epistemology and Inference
Henry E. Kyburg, Jr.
University of Minnesota Press, 1983

Epistemology and Inference was first published in 1983. Minnesota Archive Editions uses digital technology to make long-unavailable books once again accessible; these books are published unaltered from the original University of Minnesota Press editions.

Henry Kyburg has developed an original and important perspective on probabilistic and statistical inference. Unlike much contemporary writing by philosophers on these topics, Kyburg's work is informed by issues that have arisen in statistical theory and practice as well as issues familiar to professional philosophers. In two major books and many articles, Kyburg has elaborated his technical proposals and explained their ramifications for epistemology, decision-making, and scientific inquiry. In this collection of published and unpublished essays, Kyburg presents his novel ideas and their applications in a manner that makes them accessible to philosophers and provides specialists in probability and induction with a concise exposition of his system.


Evidence and Inference in History and Law
Interdisciplinary Dialogues
William Twining and Iain Hampsher-Monk
Northwestern University Press, 2003
Northwestern University Press co-published William Twining's Rethinking Evidence in 1994 and Analysis of Evidence in 1998. This new volume, Evidence and Inference, explores the application of techniques of evidence and inference across a variety of disciplines.

Coedited by Twining, one of the world's outstanding evidence scholars, and Iain Hampsher-Monk, a leading political theorist, the volume considers intriguing questions from Assyriology, theater iconography, musicology, criminology, the history of ideas, and colonial history as it reveals how particular concepts, lines of questioning, and techniques of reasoning and analysis developed in one context can be fruitfully applied in others. Did cuneiform languages really die out in the second or third century B.C.? Was Schubert responsible for any of the guitar arrangements for some of his lieder? In these cases and others, the authors' work demonstrates that, regardless of the field or the problem, all such projects involve drawing inferences from evidence, and that the logic of this kind of inquiry is always governed by the same principles.

Inference and Disputed Authorship
Frederick Mosteller and David L. Wallace
CSLI, 2013
The 1964 publication of Inference and Disputed Authorship made the cover of Time magazine and attracted the attention of academics and the public alike for its use of statistical methodology to solve one of American history’s most notorious questions: the disputed authorship of the Federalist Papers.
 
Back in print for a new generation of readers, this classic volume brings mathematics, including the once-controversial Bayesian analysis, into the heart of a literary and historical problem by studying frequently used words in the texts. The reissue of this landmark book will be welcomed by anyone interested in the juncture of history, political science, and authorship.

Inference and Representation
A Study in Modeling Science
Mauricio Suárez
University of Chicago Press, 2024
The first comprehensive defense of an inferential conception of scientific representation with applications to art and epistemology.
 
Mauricio Suárez develops a conception of representation that delivers a compelling account of modeling practice. He begins by discussing the history and methodology of model building, charting the emergence of what he calls the modeling attitude, a nineteenth-century and fin de siècle development. Prominent cases of models, both historical and contemporary, are used as benchmarks for the accounts of representation considered throughout the book. After arguing against reductive naturalist theories of scientific representation, Suárez sets out his own account: a case for pluralism regarding the means of representation and minimalism regarding its constituents. He shows that scientists employ a variety of modeling relations in their representational practice—which helps them to assess the accuracy of their representations—while demonstrating that there is nothing metaphysically deep about the constituent relation that encompasses all these diverse means.
 
The book also probes the broad implications of Suárez’s inferential conception outside scientific modeling itself, covering analogies with debates about artistic representation and philosophical thought over the past several decades.

Making Meaning
Inference and Rhetoric in the Interpretation of Cinema
David Bordwell
Harvard University Press, 1989

David Bordwell’s new book is at once a history of film criticism, an analysis of how critics interpret film, and a proposal for an alternative program for film studies. It is an anatomy of film criticism meant to reset the agenda for film scholarship. As such, Making Meaning should be a landmark book, a focus for debate from which future film study will evolve.

Bordwell systematically maps different strategies for interpreting films and making meaning, illustrating his points with a vast array of examples from Western film criticism. Following an introductory chapter that sets out the terms and scope of the argument, Bordwell goes on to show how critical institutions constrain and contain the very practices they promote, and how the interpretation of texts has become a central preoccupation of the humanities. He gives lucid accounts of the development of film criticism in France, Britain, and the United States since World War II; analyzes this development through two important types of criticism, thematic-explicatory and symptomatic; and shows that both types, usually seen as antithetical, in fact have much in common. These diverse and even warring schools of criticism share conventional, rhetorical, and problem-solving techniques—a point that has broad-ranging implications for the way critics practice their art. The book concludes with a survey of the alternatives to criticism based on interpretation and, finally, with the proposal that a historical poetics of cinema offers the most fruitful framework for film analysis.


Mental Models
Towards a Cognitive Science of Language, Inference, and Consciousness
Philip Johnson-Laird
Harvard University Press, 1983

Mental Models offers nothing less than a unified theory of the major properties of mind: comprehension, inference, and consciousness. In spirited and graceful prose, Johnson-Laird argues that we apprehend the world by building inner mental replicas of the relations among objects and events that concern us. The mind is essentially a model-building device that can itself be modeled on a digital computer. This book provides both a blueprint for building such a model and numerous important illustrations of how to do it.

In several key areas of cognition, Johnson-Laird shows how an explanation based on mental modeling is clearly superior to previous theory. For example, he argues compellingly that deductive reasoning does not take place by tacitly applying the rules of logic, but by mentally manipulating models of the states of affairs from which inferences are drawn. Similarly, linguistic comprehension is best understood not as a matter of applying inference rules to propositions derived from sentences, but rather as the mind's effort to construct and update a model of the situation described by a text or a discourse. Most provocative, perhaps, is Johnson-Laird's theory of consciousness: the mind's necessarily incomplete model of itself allows only a partial control over the many unconscious and parallel processes of cognition.

This is an extraordinarily rich book, providing a coherent account of much recent experimental work in cognitive psychology, along with lucid explanations of relevant theory in linguistics, computer science, and philosophy. Not since Miller, Galanter, and Pribram's classic Plans and the Structure of Behavior has a book in cognitive science combined such sweep, style, and good sense. Like its distinguished predecessor, Mental Models may well serve to fix a point of view for a generation.


The Myth of Artificial Intelligence
Why Computers Can’t Think the Way We Do
Erik J. Larson
Harvard University Press, 2021

“Exposes the vast gap between the actual science underlying AI and the dramatic claims being made for it.”
—John Horgan


“If you want to know about AI, read this book…It shows how a supposedly futuristic reverence for Artificial Intelligence retards progress when it denigrates our most irreplaceable resource for any future progress: our own human intelligence.”
—Peter Thiel

Ever since Alan Turing, AI enthusiasts have equated artificial intelligence with human intelligence. A computer scientist working at the forefront of natural language processing, Erik Larson takes us on a tour of the landscape of AI to reveal why this is a profound mistake.

AI works on inductive reasoning, crunching data sets to predict outcomes. But humans don’t correlate data sets. We make conjectures, informed by context and experience. And we haven’t a clue how to program that kind of intuitive reasoning, which lies at the heart of common sense. Futurists insist AI will soon eclipse the capacities of the most gifted mind, but Larson shows how far we are from superintelligence—and what it would take to get there.

“Larson worries that we’re making two mistakes at once, defining human intelligence down while overestimating what AI is likely to achieve…Another concern is learned passivity: our tendency to assume that AI will solve problems and our failure, as a result, to cultivate human ingenuity.”
—David A. Shaywitz, Wall Street Journal

“A convincing case that artificial general intelligence—machine-based intelligence that matches our own—is beyond the capacity of algorithmic machine learning because there is a mismatch between how humans and machines know what they know.”
—Sue Halpern, New York Review of Books


Observation and Experiment
An Introduction to Causal Inference
Paul R. Rosenbaum
Harvard University Press, 2017

A daily glass of wine prolongs life—yet alcohol can cause life-threatening cancer. Some say raising the minimum wage will decrease inequality while others say it increases unemployment. Scientists once confidently claimed that hormone replacement therapy reduced the risk of heart disease but now they equally confidently claim it raises that risk. What should we make of this endless barrage of conflicting claims?

Observation and Experiment is an introduction to causal inference by one of the field’s leading scholars. An award-winning professor at Wharton, Paul Rosenbaum explains key concepts and methods through lively examples that make abstract principles accessible. He draws his examples from clinical medicine, economics, public health, epidemiology, clinical psychology, and psychiatry to explain how randomized controlled trials are conceived and designed, how they differ from observational studies, and what techniques are available to mitigate the bias of observational studies.

“Carefully and precisely written…reflecting superb statistical understanding, all communicated with the skill of a master teacher.”
—Stephen M. Stigler, author of The Seven Pillars of Statistical Wisdom

“An excellent introduction…Well-written and thoughtful…from one of causal inference’s noted experts.”
—Journal of the American Statistical Association

“Rosenbaum is a gifted expositor…an outstanding introduction to the topic for anyone who is interested in understanding the basic ideas and approaches to causal inference.”
—Psychometrika

“A very valuable contribution…Highly recommended.”
—International Statistical Review


Tychomancy
Inferring Probability from Causal Structure
Michael Strevens
Harvard University Press, 2013

Tychomancy—meaning “the divination of chances”—presents a set of rules for inferring the physical probabilities of outcomes from the causal or dynamic properties of the systems that produce them. Probabilities revealed by the rules are wide-ranging: they include the probability of getting a 5 on a die roll, the probability distributions found in statistical physics, and the probabilities that underlie many prima facie judgments about fitness in evolutionary biology.

Michael Strevens makes three claims about the rules. First, they are reliable. Second, they are known, though not fully consciously, to all human beings: they constitute a key part of the physical intuition that allows us to navigate around the world safely in the absence of formal scientific knowledge. Third, they have played a crucial but unrecognized role in several major scientific innovations.

A large part of Tychomancy is devoted to this historical role for probability inference rules. Strevens first analyzes James Clerk Maxwell’s extraordinary, apparently a priori, deduction of the molecular velocity distribution in gases, which launched statistical physics. Maxwell did not derive his distribution from logic alone, Strevens proposes, but rather from probabilistic knowledge common to all human beings, even infants as young as six months old. Strevens then turns to Darwin’s theory of natural selection, the statistics of measurement, and the creation of models of complex systems, contending in each case that these elements of science could not have emerged when or how they did without the ability to “eyeball” the values of physical probabilities.


