
Echo Objects: The Cognitive Work of Images
by Barbara Maria Stafford
University of Chicago Press, 2007
Paper: 978-0-226-77052-9 | Cloth: 978-0-226-77051-2
Library of Congress Classification BF311.S67715 2007
Dewey Decimal Classification 153.32

Barbara Maria Stafford is at the forefront of a growing movement that calls for the humanities to confront the brain’s material realities. In Echo Objects, she argues that humanists should seize upon the exciting neuroscientific discoveries that are illuminating the underpinnings of cultural objects. In turn, she contends, brain scientists could enrich their investigations of mental activity by incorporating phenomenological considerations—particularly the intricate ways that images focus intentional behavior and allow us to feel thought.
As a result, Echo Objects is a stunningly broad exploration of how complex images—or patterns that compress space and time—make visible the invisible ordering of human consciousness. Stafford demonstrates, for example, how the compound formats of emblems, symbols, collage, and electronic media reveal the brain’s grappling to construct mental objects that are redoubled by prior associations. In contrast, she shows that findings in evolutionary biology and the neurosciences are providing profound opportunities for understanding aesthetic conundrums such as the human urge to imitate and the role of narrative and nonnarrative representation.
Ultimately, she makes an impassioned plea for a common purpose—for the acknowledgment that, at the most basic level, these separate projects belong to a single investigation.
“Heroic. . . . The larger message of Stafford’s intense, propulsive prose is unassailable. If we are to get much further in the great puzzle of ‘binding’—how the perception of an image, the will to act on intention, or the forging of consciousness is assembled from the tens of thousands of neurons firing at any one moment in time—then there needs to be action on all fronts.”—Science
