Advances in Modal Logic, Volume 1
Edited by Marcus Kracht, Maarten de Rijke, Heinrich Wansing, and Michael Zakharyaschev. CSLI, 1998. Library of Congress BC199.M6A38 1998 | Dewey Decimal 160
Modal logic originated in philosophy as the logic of necessity and possibility. Nowadays it has reached a high level of mathematical sophistication and found many applications in a variety of disciplines, including theoretical and applied computer science, artificial intelligence, the foundations of mathematics, and natural language syntax and semantics.
This volume represents the proceedings of the first international workshop on Advances in Modal Logic, held in Berlin, Germany, October 8-10, 1996. It offers an up-to-date perspective on the field, with contributions covering its proof theory, its applications in knowledge representation, computing and mathematics, as well as its theoretical underpinnings.
"This collection is a useful resource for anyone working in modal logic. It contains both interesting surveys and cutting-edge technical results."
--Edwin D. Mares
The Bulletin of Symbolic Logic, March 2002
CAA is the foremost conference on digital archaeology, and this volume offers a comprehensive and up-to-date reference to the state of the art. It contains a selection of the best papers presented at the 40th Annual Conference of Computer Applications and Quantitative Methods in Archaeology (CAA), held in Southampton from 26 to 29 March 2012. The papers, all written and peer-reviewed by experts in the field of digital archaeology, explore a multitude of topics, showcasing ground-breaking technologies and best practice from various archaeological and informatics disciplines, with case studies from all over the world.
Blockchain and machine learning technologies can mitigate healthcare issues such as slow access to medical data, poor system interoperability, lack of patient agency, and shortfalls in the quality and quantity of data available for medical research. Blockchain technology facilitates and secures the storage of information so that doctors can see a patient's entire medical history while researchers see only statistical data, never personal information. Machine learning can then draw on this data to detect patterns and make accurate predictions, supporting both patients and the research fields that depend on accurate data for credible results.
Since the dawn of the age of computers, researchers have been pushing the limits of available processing power to tackle the formidable challenge of developing software that can understand ordinary human language. At the forefront of this quest for the past fifty years, Martin Kay has been a constant source of new algorithms which have proven fundamental to progress in computational linguistics. Collected Papers of Martin Kay, the first comprehensive collection of his works to date, opens a window into the growth of an increasingly important field of scientific research and development.
Donald E. Knuth. CSLI, 1999. Library of Congress Z249.3.K59 1999 | Dewey Decimal 686.22544536
In this collection, the second in the series, Knuth explores the relationship between computers and typography. The present volume is, in the author's words, a legacy of all the work he has done on typography. As is well known, what he expected to be a few years' leave from his main work on the art of computer programming turned into a typographic detour lasting more than a decade. The type designers, punch cutters, typographers, book historians, and scholars who visited Stanford during this period gave the university what some consider its golden age of digital typography. By the author's own admission, the present work is one of the most difficult books he has prepared. This is truly a work that only Knuth himself could have produced.
Can computer science solve our social problems? With Discourses on Social Software, Jan van Eijck and Rineke Verbrugge suggest it can, offering the reader a fascinating introduction to the innovative field of social software. Compiling a series of discussions involving a logician, a computer scientist, a philosopher, and researchers from various other academic fields, this collection details the many ways in which the seemingly abstract disciplines of logic and computer science can be used to analyze and solve contemporary social problems.
The theory of computation is used to address challenges arising in many areas of computer science, such as artificial intelligence, language processors, compiler writing, information and coding systems, programming language design, and computer architecture. To grasp topics concerning this theory, readers need to familiarize themselves with its computational and language models, which are based on concepts of discrete mathematics including sets, relations, functions, graphs, and logic.
In this age of DNA computers and artificial intelligence, information is becoming disembodied even as the "bodies" that once carried it vanish into virtuality. While some marvel at these changes, envisioning consciousness downloaded into a computer or humans "beamed" Star Trek-style, others view them with horror, seeing monsters brooding in the machines. In How We Became Posthuman, N. Katherine Hayles separates hype from fact, investigating the fate of embodiment in an information age.
Hayles relates three interwoven stories: how information lost its body, that is, how it came to be conceptualized as an entity separate from the material forms that carry it; the cultural and technological construction of the cyborg; and the dismantling of the liberal humanist "subject" in cybernetic discourse, along with the emergence of the "posthuman."
Ranging widely across the history of technology, cultural studies, and literary criticism, Hayles shows what had to be erased, forgotten, and elided to conceive of information as a disembodied entity. Thus she moves from the post-World War II Macy Conferences on cybernetics to the 1952 novel Limbo by cybernetics aficionado Bernard Wolfe; from the concept of self-making to Philip K. Dick's literary explorations of hallucination and reality; and from artificial life to postmodern novels exploring the implications of seeing humans as cybernetic systems.
Although becoming posthuman can be nightmarish, Hayles shows how it can also be liberating. From the birth of cybernetics to artificial life, How We Became Posthuman provides an indispensable account of how we arrived in our virtual age, and of where we might go from here.
The rapid advancements in telecommunications, computing hardware and software, and data encryption, together with the widespread use of electronic data processing and of electronic business conducted over the Internet, have led to a sharp increase in information security threats. The latest advances in information security have enabled practical deployments at scale across a wide range of applications, better securing and protecting our information systems and the information they store, process, and transmit. This book outlines key emerging trends in information security, from the foundations and technologies of biometrics, cybersecurity, and big data security to applications in hardware and embedded systems security, computer forensics, Internet of Things security, and network security.
The interviews in this volume form the nearest thing possible to an autobiography of eminent computer scientist Donald E. Knuth. Based on the English-language Companion to the Papers of Donald Knuth, also published by CSLI Publications, this book brings the highlights of that material to a Francophone audience.
Computerized processes are everywhere in our society. They are the automated phone messaging systems that businesses use to screen calls; the link between student standardized test scores and public schools’ access to resources; the algorithms that regulate patient diagnoses and reimbursements to doctors. The storage, sorting, and analysis of massive amounts of information have enabled the automation of decision-making at an unprecedented level. Meanwhile, computers have offered a model of cognition that increasingly shapes our approach to the world. The proliferation of “roboprocesses” is the result, as editors Catherine Besteman and Hugh Gusterson observe in this rich and wide-ranging volume, which features contributions from a distinguished cast of scholars in anthropology, communications, international studies, and political science.
Although automatic processes are designed to be engines of rational systems, the stories in Life by Algorithms reveal how they can in fact produce absurd, inflexible, or even dangerous outcomes. Joining the call for “algorithmic transparency,” the contributors bring exceptional sensitivity to everyday sociality into their critique to better understand how the perils of modern technology affect finance, medicine, education, housing, the workplace, food production, public space, and emotions—not as separate problems but as linked manifestations of a deeper defect in the fundamental ordering of our society.
Catherine Besteman, Alex Blanchette, Robert W. Gehl, Hugh Gusterson, Catherine Lutz, Ann Lutz Fernandez, Joseph Masco, Sally Engle Merry, Keesha M. Middlemass, Noelle Stout, Susan J. Terrio
The fields of logic, linguistics, and computer science are intimately related, and modern research has uncovered a wide range of connections between them. This collection of nineteen papers focuses on work unified by the concept of information, covering subjects such as channel theory, presupposition and constraints, the modeling of discourse, and belief. All of the papers were presented at the 1996 Conference on Information-Theoretic Approaches to Logic, Language, Information, and Computation.
Computing has moved away from performance-centric serial computation and toward energy-efficient parallel computation. This shift provides continued performance increases without raising clock frequencies and overcomes the thermal and power limitations of the dark-silicon era. As the number of parallel cores grows, we are transitioning into the many-core computing era, and there is considerable interest in developing methods, tools, architectures, and applications to support many-core computing.
Can new technology enhance local, national, and global democracy? Online Deliberation is the first book that attempts to sample the full range of work on online deliberation, forging new connections between academic research, web designers, and practitioners.
Because the most exciting innovations in deliberation have occurred outside traditional institutions, and those involved have often worked in relative isolation from one another, research on this growing field has until now lacked a full perspective on online participation. This volume, an essential read for those working at the crossroads of computer science and social science, illuminates the collaborative world of deliberation by examining diverse clusters of Internet communities.
The field of weak arithmetics is an application of logical methods to number theory that was developed by mathematicians, philosophers, and theoretical computer scientists. In this volume, after a general presentation of weak arithmetics, the following topics are studied: the properties of integers of a real closed field equipped with exponentiation; conservation results for the induction schema restricted to first-order formulas with a finite number of alternations of quantifiers; a survey on a class of tools called pebble games; the fact that the reals e and pi have approximations expressed by first-order formulas using bounded quantifiers; properties of infinite pictures depending on the universe of sets used; a language that simulates in a sufficiently nice manner all algorithms of a certain restricted class; the logical complexity of the axiom of infinity in some variants of set theory without the axiom of foundation; and the complexity to determine whether a trace is included in another one.
The field of weak arithmetics is an application of logical methods to number theory that was developed by mathematicians, philosophers, and theoretical computer scientists. This third volume in the weak arithmetics collection contains nine substantive papers based on lectures delivered during the last two meetings of the conference series Journées sur les Arithmétiques Faibles, held in 2014 at the University of Gothenburg, Sweden, and in 2015 at the City University of New York Graduate Center.
How does a computer scientist understand infinity? What can probability theory teach us about free will? Can mathematical notions be used to enhance one's personal understanding of the Bible?
Perhaps no one is more qualified to address these questions than Donald E. Knuth, whose massive contributions to computing have led others to nickname him "The Father of Computer Science"—and whose religious faith led him to undertake a fascinating analysis of the Bible called the 3:16 project. In this series of six spirited, informal lectures, Knuth explores the relationships between his vocation and his faith, revealing the unique perspective that his work with computing has lent to his understanding of God.
His starting point is the 3:16 project, an application of mathematical "random sampling" to the books of the Bible. The first lectures tell the story of the project's conception and execution, exploring its many dimensions of language translation, aesthetics, and theological history. Along the way, Knuth explains the many insights he gained from such interdisciplinary work. These theological musings culminate in a surprising final lecture tackling the ideas of infinity, free will, and some of the other big questions that lie at the juncture of theology and computation.
Things a Computer Scientist Rarely Talks About, with its charming and user-friendly format—each lecture ends with a question and answer exchange, and the book itself contains more than 100 illustrations—is a readable and intriguing approach to a crucial topic, certain to edify both those who are serious and curious about their faiths and those who look at the science of computation and wonder what it might teach them about their spiritual world.
Includes "Creativity, Spirituality, and Computer Science," a panel discussion featuring Harry Lewis, Guy L. Steele, Jr., Manuela Veloso, Donald E. Knuth, and Mitch Kapor.
Videogames are firmly enmeshed in modern culture. Acknowledging the increasing cultural impact of this rapidly changing industry on artistic and creative practices, Videogames and Art features in-depth essays that offer an unparalleled overview of the field.
Together, the contributions position videogame art as an interdisciplinary mix of digital technologies and traditional art forms. Of particular interest in this volume are machinima, game console artwork, politically oriented videogame art, and the production of digital art. This new and revised edition features an extended critical introduction from the editors and updated interviews with the foremost artists in the field. Rounding out the book is a critique of the commercial videogame industry, comprising essays on the current quality and originality of videogames.
The importance of testing integrated circuits (ICs) has escalated with the increasing complexity of circuits fabricated on a single IC chip. No longer is it possible to design a new IC and then think about testing: such considerations must be part of the initial design activity, and testing strategies should be part of every circuit and system designer's education. This book is a comprehensive introduction and reference for all aspects of IC testing. It includes all of the basic concepts and theories necessary for advanced students, from practical test strategies and industrial practice, to the economic and managerial aspects of testing. In addition to detailed coverage of digital network testing, VLSI testing also considers in depth the growing area of testing analogue and mixed analogue/digital ICs, used particularly in signal processing.