"Accountability" is a watchword of our era. Dissatisfaction with a range of public and private institutions is widespread and often expressed in strong critical rhetoric. The reasons for these views are varied and difficult to translate into concrete action, but this hasn't deterred governments and nongovernmental organizations from putting into place formal processes for determining whether their own and others' goals have been achieved and problems with performance have been avoided.
In this thought-provoking book, government and public administration scholar Beryl Radin takes on many of the assumptions of the performance movement, arguing that evaluation relies too often on simplistic, one-size-fits-all solutions that are not always effective for dynamic organizations. Drawing on a wide range of ideas, including theories of intelligence and modes of thought, assumptions about numbers and information, and the nature of professionalism, Radin sheds light on the hidden complexities of creating standards to evaluate performance. She illustrates these problems by discussing a range of program areas, including health efforts as well as the education program, "No Child Left Behind."
Throughout, the author devotes particular attention to concerns about government standards, from accounting for issues of equity to allowing for complicated intergovernmental relationships and fragmentation of powers. She explores in detail how recent performance measurement efforts in the U.S. government have fared, and analyzes efforts by nongovernmental organizations both inside and outside of the United States to impose standards of integrity and equity on their governments. The examination concludes with alternative assumptions and lessons for those embarking on performance measurement activities.
Marine pollution causes significant damage to fisheries and other economically productive uses of the ocean. The value of that damage can be quantified by economists, but the meanings of those valuations and how they are derived are often obscure to noneconomists. Economic Losses from Marine Pollution brings a fuller understanding of the variety and extent of marine losses and how they are assessed to scientists, lawyers, and environmentalists by systematically identifying and classifying marine losses and relating them to models and methods of economic valuation. The authors use a step-by-step approach to show how economists have used these methods and how they approach the problem of assessing economic damage. The book begins by describing the importance of economic valuation of marine damages, the history of concern over marine pollution, and the development of economic methodologies to assess damage from it. Following that, the book:
considers types of marine pollution and their effects on organisms, ecosystems, and humans, and the corresponding economic effects of those biological impacts
introduces the economic principles and methods needed to understand and to assess economic damages
expresses losses from water quality impairments in terms of economic value
introduces the basic economic techniques that have been developed and used to measure changes in economic value
discusses how to apply those economic techniques, and presents a variety of practical examples
explores limitations and problems that can arise in such applied work
Economic Losses from Marine Pollution includes all of the relevant economic theory together with specific examples of how that theory has been and can be applied.
It offers environmental professionals with little or no background in economics the basic economic tools needed to understand economic valuations of environmental damage, and represents a unique handbook for environmental and marine scientists, lawyers, economists, policy professionals, and anyone interested in issues of marine water quality.
Electromagnetic Field Standards and Exposure Systems covers the broader fields of measurements in telecommunications, radio navigation, radio astronomy, bioscience, and free-ranging EM radiation, and helps to develop the following measurement standards:
proper calibration of the measuring instrument
external environmental factors that affect accuracy
competence and training of the instrument operator
This book is devoted to the specific problems of electromagnetic field (EMF) measurements in the near field and to the analysis of the main factors which impede accuracy in these measurements. It focuses on careful and accurate design of systems to measure in the near field based on a thorough understanding of the fundamental engineering principles and on an analysis of the likely system errors. Beginning with a short introduction to electromagnetic fields with an emphasis on the near field, it then presents methods of EMF measurements in near field conditions. It details the factors limiting measurement accuracy including internal ones (thermal stability, frequency response, dynamic characteristics, susceptibility) and external ones (field integration, mutual couplings between a probe and primary and secondary EMF sources, directional pattern deformations). It continues with a discussion on how to gauge the parameters declared by an EMF meter manufacturer and simple methods for testing these parameters. It also details how designers of measuring equipment can reconsider the near field when designing and testing, as well as how users can exploit the knowledge within the book to ensure their tests and results contain the most accurate measurements possible. The SciTech Publishing Series on Electromagnetic Compatibility provides a continuously growing body of knowledge in the latest development and best practices in electromagnetic compatibility engineering. This series provides specialist and non-specialist professionals and students practical knowledge that is thoroughly grounded in relevant theory.
As much as one-tenth of the world’s oceans are covered with sea ice, or frozen ocean water, at some point during the annual cycle. Sea ice thus plays an important, often defining, role in the natural environment and the global climate system. This book is a global look at the changes in sea ice and the tools and techniques used to measure and record those changes. The first comprehensive research done on sea-ice field techniques, this volume will be indispensable for the study of northern sea ice and a must-have for scientists in the field of climate change research.
According to the theory of relativity, we are constantly bathed in gravitational radiation. When stars explode or collide, a portion of their mass becomes energy that disturbs the very fabric of the space-time continuum like ripples in a pond. But proving the existence of these waves has been difficult; the cosmic shudders are so weak that only the most sensitive instruments can be expected to observe them directly. Fifteen times during the last thirty years scientists have claimed to have detected gravitational waves, but so far none of those claims have survived the scrutiny of the scientific community. Gravity's Shadow chronicles the forty-year effort to detect gravitational waves, while exploring the meaning of scientific knowledge and the nature of expertise.
Gravitational wave detection involves recording the collisions, explosions, and trembling of stars and black holes by evaluating the smallest changes ever measured. Because gravitational waves are so faint, their detection will come not in an exuberant moment of discovery but through a chain of inference; for forty years, scientists have debated whether there is anything to detect and whether it has yet been detected. Sociologist Harry Collins has been tracking the progress of this research since 1972, interviewing key scientists and delineating the social process of the science of gravitational waves.
Engagingly written and authoritatively comprehensive, Gravity's Shadow explores the people, institutions, and government organizations involved in the detection of gravitational waves. This sociological history will prove essential not only to sociologists and historians of science but to scientists themselves.
Von den Driesch's handbook is the standard tool used by faunal analysts working on animal and bird assemblages from around the world. Developed for the instruction of students working on osteoarchaeological theses at the University of Munich, the guide has standardized how animal bones recovered from prehistoric and early historic sites are measured.
During the 1980s the worldwide interest in electromagnetic compatibility (EMC) grew rapidly with the introduction of legislation to control the growing interference problems generated by the increased use of electronic equipment in industry and in the home. The European directive harmonising EMC measurements gave particular impetus to manufacturers and importers of electrical and electronic equipment in Europe to understand EMC design techniques and verification procedures. This book explains how equipment can be verified by testing. It discusses the nature of EMC standards worldwide and describes in detail testing methods and their conduct and accuracy. In addition to standard EMC testing, topics including electrostatic discharge, nuclear electromagnetic pulse and lightning are also discussed.
In the broad span of its subject matter, the interests of equipment manufacturers, EMC test engineers, project managers and company administrators are addressed. The testing of both military and commercial electronic equipment is covered. Particular emphasis is placed on the nature of EMC test equipment and how to use it to make reliable measurements.
This book updates and expands the editor's acclaimed Electrical Resistivity Handbook, bringing together advances in the field over the last two decades. In this period, much has been achieved in the fields of new materials and superconductivity.
This new volume provides a comprehensive compilation, in graphical form, of experimental data on the resistivity/resistance of over 1000 elements, compounds and alloys, organized in three sections. The first section deals with resistivity as a function of temperature, the second with resistivity as a function of temperature and pressure, whilst the third deals with the normalised resistance of materials as a function of temperature and/or pressure.
As the United States moves to a low-carbon economy in order to combat global warming, credits for reducing carbon dioxide emissions will increasingly become a commodity that is bought and sold on the open market. Farmers and other landowners can benefit from this new economy by conducting land management practices that help sequester carbon dioxide, creating credits they can sell to industry to “offset” industrial emissions of greenhouse gases.
This guide is the first comprehensive technical publication providing direction to landowners for sequestering carbon and information for traders and others who will need to verify the sequestration. It will provide invaluable direction to farmers, foresters, land managers, consultants, brokers, investors, regulators, and others interested in creating consistent, credible greenhouse gas offsets as a tradable commodity in the United States.
The guide contains a non-technical section detailing methodologies for scoping of the costs and benefits of a proposed project, quantifying offsets of various sorts under a range of situations and conditions, and verifying and registering the offsets. The technical section provides specific information for quantifying, verifying, and regulating offsets from agricultural and forestry practices.
Improvements in the accuracy, computational cost, and reliability of computational techniques for high-frequency electromagnetics (including antennas, microwave devices and radar scattering applications) can be achieved through the use of 'high-order' techniques. This book outlines these techniques by presenting high-order basis functions, explaining their use, and illustrating their performance. The specific basis functions under consideration were developed by the authors, and include scalar and vector functions for use with equations such as the vector Helmholtz equation and the electric field integral equation.
The book starts by considering the approximation of scalar functions, and explores the error in some of those representations. Singular functions (those that are unbounded) are also considered, since these often arise in practical EM problems. The authors then discuss the approximation of vector functions, and summarize the various classes of vector basis functions that have been used by the professional community. Following this, they present higher-order basis functions for the most common cell shapes used in finite element analysis procedures. Finally, they consider some of the implementation details associated with the use of these functions for integral equation/method of moments formulations and differential equation/finite element method approaches.
This book provides an essential introduction to these techniques for researchers, graduate students and practicing professionals in the discipline of computational electromagnetics.
Robust and reliable measures of consumer expenditures are essential for analyzing aggregate economic activity and for measuring differences in household circumstances. Many countries, including the United States, are embarking on ambitious projects to redesign surveys of consumer expenditures, with the goal of better capturing economic heterogeneity. This is an appropriate time to examine the way consumer expenditures are currently measured, and the challenges and opportunities that alternative approaches might present.
Improving the Measurement of Consumer Expenditures begins with a comprehensive review of current methodologies for collecting consumer expenditure data. Subsequent chapters highlight the range of different objectives that expenditure surveys may satisfy, compare the data available from consumer expenditure surveys with that available from other sources, and describe how current survey practices in the United States compare with those in other nations.
Few events in the history of humanity rival the Industrial Revolution. Following its onset in eighteenth-century Britain, sweeping changes in agriculture, manufacturing, transportation, and technology began to gain unstoppable momentum throughout Europe, North America, and eventually much of the world—with profound effects on socioeconomic and cultural conditions.
In The Institutional Revolution, Douglas W. Allen offers a thought-provoking account of another, quieter revolution that took place at the end of the eighteenth century and allowed for the full exploitation of the many new technological innovations. Fundamental to this shift were dramatic changes in institutions, or the rules that govern society, which reflected significant improvements in the ability to measure performance—whether of government officials, laborers, or naval officers—thereby reducing the role of nature and the hazards of variance in daily affairs. Along the way, Allen provides readers with a fascinating explanation of the critical roles played by seemingly bizarre institutions, from dueling to the purchase of one’s rank in the British Army.
Engagingly written, The Institutional Revolution traces the dramatic shift from premodern institutions based on patronage, purchase, and personal ties toward modern institutions based on standardization, merit, and wage labor—a shift which was crucial to the explosive economic growth of the Industrial Revolution.
How the government arrives at its official economic statistics deeply influences the lives of every American. Social Security payments and even some wages are linked to import prices through official inflation rates; special measures of national product are necessary for valid comparisons of vital social indicators such as relative standards of living and relative poverty. Poor information can result in poor policies. And yet, federal statistics agencies have been crippled by serious budget cuts—and more cuts may lie ahead.
Questioning the quality of current data and analytical procedures, this ambitious volume proposes innovative research designs and methods for data enhancement, and offers new data on trade prices and service transactions for future studies. Leading researchers address the measurement of international trade flows and prices, including the debate over measurement of computer prices and national productivity; compare international levels of manufacturing output; and assess the extent to which the United States has fallen into debt to the rest of the world.
Measurement is all around us—from the circumference of a pizza to the square footage of an apartment, from the length of a newborn baby to the number of miles between neighboring towns. Whether inches or miles, centimeters or kilometers, measures of distance stand at the very foundation of everything we do, so much so that we take them for granted. Yet, this has not always been the case.
This book reaches back to medieval Italy to speak of a time when measurements were displayed in the open, showing how such a deceptively simple innovation triggered a chain of cultural transformations whose consequences are visible today on a global scale. Drawing from literary works and frescoes, architectural surveys and legal compilations, Emanuele Lugli offers a history of material practices widely overlooked by historians. He argues that the public display of measurements in Italy’s newly formed city republics not only laid the foundation for now centuries-old practices of making, but also helped to legitimize local governments and shore up church power, buttressing fantasies of exactitude and certainty that linger to this day.
This ambitious, truly interdisciplinary book explains how measurements, rather than being mere descriptors of the real, themselves work as powerful molds of ideas, affecting our notions of what we consider similar, accurate, and truthful.
Measurement
Paul Lockhart Harvard University Press, 2012 Library of Congress QA447.L625 2012 | Dewey Decimal 516
Lockhart’s Mathematician’s Lament outlined how we introduce math to students in the wrong way. Measurement explains how math should be done. With plain English and pictures, he makes complex ideas about shape and motion intuitive and graspable, and offers a solution to math phobia by introducing us to math as an artful way of thinking and living.
The Measurement of Capital
Edited by Dan Usher University of Chicago Press, 1980 Library of Congress HC106.3.C714 vol. 45 | Dewey Decimal 330.08
How is real capital measured by government statistical agencies? How could this measure be improved to correspond more closely to an economist's ideal measure of capital in economic analysis and prediction? Is it possible to construct a single, reliable time series for all capital goods, regardless of differences in vintage, technological complexity, and rates of depreciation? These questions represent the common themes of this collection of papers, originally presented at a 1976 meeting of the Conference on Income and Wealth.
American business has recently been under fire, charged with inflated pricing and an inability to compete in the international marketplace. However, the evidence presented in this volume shows that the business community has been unfairly maligned—official measures of inflation and the standard of living have failed to account for progress in the quality of business equipment and consumer goods. Businesses have actually achieved higher productivity at lower prices, and new goods are lighter, faster, more energy efficient, and more reliable than their predecessors.
Robert J. Gordon has written the first full-scale work to treat the extent of quality changes over the entire range of durable goods, from autos to aircraft, computers to compressors, from televisions to tractors. He combines and extends existing methods of measurement, drawing data from industry sources, Consumer Reports, and the venerable Sears catalog.
Beyond his important finding that the American economy is more sound than officially recognized, Gordon provides a wealth of anecdotes tracing the postwar history of technological progress. Bolstering his argument that improved quality must be accurately measured, Gordon notes, for example, that today's mid-range personal computers outperform the multimillion-dollar mainframes of the 1970s. This remarkable book will be essential reading for economists and those in the business community.
The Measurement of Labor Cost
Edited by Jack E. Triplett University of Chicago Press, 1983 Library of Congress HC106.3.C714 vol. 48 | Dewey Decimal 330
Measuring costs of labor as a portion of total production costs has never before been treated so thoroughly or so thoughtfully. Moreover, contrary to most recent labor research, this book focuses on the demand side—the employer's point of view—and the behavior studied is employer behavior.
An introductory essay by the editor provides a useful guide to current thought in the analysis of labor cost. Other papers give new insights into problems encountered in accounting for the nonwage elements of labor compensation, the effect of pensions and other benefits, and the wage-measurement questions raised by incomes policies. In addition, there is a wealth of valuable new data on labor costs in the United States.
Labor economists, statisticians, econometric modelers, and advisers to government and industry will welcome this up-to-date and comprehensive treatment of the costs of production.
In this pioneering study, the authors deal with the nature and theory of meaning and present a new, objective method for its measurement which they call the semantic differential. This instrument is not a specific test, but rather a general technique of measurement that can be adapted to a wide variety of problems in such areas as clinical psychology, social psychology, linguistics, mass communications, esthetics, and political science. The core of the book is the authors' description, application, and evaluation of this important tool and its far-reaching implications for empirical research.
Measurement of Nontariff Barriers
Alan V. Deardorff and Robert M. Stern, Editors University of Michigan Press, 1998 Library of Congress HF1430.D428 1998 | Dewey Decimal 382.3
As tariffs on imports of manufactures have been reduced as a result of multilateral trade negotiations, interest in the extent to which existing nontariff barriers may distort and restrict international trade is growing. Accurate and reliable measures are needed in order to address the issues involving the use and impacts of nontariff barriers. This study assesses currently available methods for quantifying such barriers and makes recommendations as to those methods that can be most effectively employed. The authors focus both on the conceptual issues arising in the measurement of the different types of nontariff barriers and on the applied research that has been carried out in studies prepared by country members of the OECD Pilot Group and others seeking to quantify the barriers.
Nontariff barriers include quotas, variable levies, voluntary export restraints, government procurement regulations, domestic subsidies, and antidumping and countervailing duty measures. The authors discuss the many different methods available for measuring the effects of these and other nontariff barriers. Illustrative results are presented for industrial OECD countries, including Australia, Canada, Germany, Norway, the European Union, the United Kingdom, and the United States. Finally, the authors offer guideline principles and recommend procedures for measuring different types of nontariff barriers.
Economists, political scientists, government officials, and lawyers involved in international trade will find this an invaluable resource for understanding and measuring NTBs.
Alan V. Deardorff and Robert M. Stern are Professors of Economics and Public Policy, University of Michigan.
There is probably no concept other than saving for which U.S. official agencies issue annual estimates that differ by more than a third, as they have done for net household saving, or for which reputable scholars claim that the correct measure is close to ten times the officially published one. Yet despite agreement among economists and policymakers on the importance of this measure, huge inconsistencies persist.
Contributors to this volume investigate ways to improve aggregate and sectoral saving and investment estimates and analyze microdata from recent household wealth surveys. They provide analyses of National Income and Product Account (NIPA) and Flow-of-Funds measures and of saving and survey-based wealth estimates. Conceptual and methodological questions are discussed regarding long-term trends in U.S. wealth inequality, age-wealth profiles, pensions and wealth distribution, and biases in inferences about life-cycle changes in saving and wealth. Some new assessments are offered for investment in human and nonhuman capital, the government contribution to national wealth, NIPA personal and corporate saving, and banking imputation.
The Measurement of Urban Home Environment was first published in 1936. Minnesota Archive Editions uses digital technology to make long-unavailable books once again accessible; these editions are published unaltered from the original University of Minnesota Press editions. No. 11, Institute of Child Welfare Monograph Series. This volume contributes a validation and standardization of the Minnesota Home Status Index, a scale constructed by Professor Alice Leahy that gives numerical expression to the nature and extent of variation existing in the living conditions of urban homes. Leahy describes the methods used in constructing the index and discusses the significance of her findings. Also included are accounts of previous studies in this field, a bibliography, and an appendix of the schedules used in Leahy’s investigation.
Measuring and Modeling Health Care Costs
Edited by Ana Aizcorbe, Colin Baker, Ernst R. Berndt, and David M. Cutler University of Chicago Press, 2018 Library of Congress RA410.53.M387 2018 | Dewey Decimal 338.4336210973
Health care costs represent nearly 18% of U.S. gross domestic product and 20% of government spending. While there is detailed information on where these health care dollars are spent, there is much less evidence on how this spending affects health.
The research in Measuring and Modeling Health Care Costs seeks to connect our knowledge of expenditures with what we are able to measure of results, probing questions of methodology, changes in the pharmaceutical industry, and the shifting landscape of physician practice. The studies in this volume investigate, for example, obesity’s effect on health care spending, the effect of generic pharmaceutical releases on the market, and the disparity between disease-based and population-based spending measures. This vast and varied volume applies a range of economic tools to the analysis of health care and health outcomes.
Practical and descriptive, this new volume in the Studies in Income and Wealth series is full of insights relevant to health policy students and specialists alike.
Since the Great Depression, researchers and statisticians have recognized the need for more extensive methods for measuring economic growth and sustainability. The recent recession renewed commitments to closing long-standing gaps in economic measurement, including those related to sustainability and well-being.
The latest in the NBER’s influential Studies in Income and Wealth series, which has played a key role in the development of national account statistics in the United States and other nations, this volume explores collaborative solutions between academics, policy researchers, and official statisticians to some of today’s most important economic measurement challenges. Contributors to this volume extend past research on the integration and extension of national accounts to establish an even more comprehensive understanding of the distribution of economic growth and its impact on well-being, including health, human capital, and the environment. The research contributions assess, among other topics, specific conceptual and empirical proposals for extending national accounts.
Segmenting the environment surrounding an autonomous vehicle into coherently moving regions is a vital first step towards intelligent autonomous navigation. Without this temporal information, navigation becomes a simple obstacle avoidance scheme that is inappropriate in highly dynamic environments such as roadways and places where many people congregate.
The book begins by looking at the problem of motion estimation from biological, algorithmic and digital perspectives. It goes on to describe an algorithm that fits with the motion processing model, and hardware and software constraints. This algorithm is based on the optical flow constraint equation and introduces range information to resolve the depth-velocity ambiguity, which is critical for autonomous navigation. Finally, implementation of the algorithm in digital hardware is described in detail, covering both the initial motion processing model and the chosen hardware platforms, and the global functional structure of the system.
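For readers unfamiliar with the technique named above: the optical flow constraint equation is a standard result in motion estimation, given here as general background rather than as this book's specific formulation. It relates the spatial and temporal derivatives of image brightness $I(x, y, t)$ to the image-plane velocity $(u, v)$:

$$I_x\, u + I_y\, v + I_t = 0$$

where $I_x$, $I_y$, and $I_t$ denote the partial derivatives of brightness with respect to $x$, $y$, and time. Because this is a single equation in two unknowns, it cannot determine both velocity components on its own (the aperture problem), which is why additional information, such as the range data discussed above, is needed to resolve the depth-velocity ambiguity.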
Price Index Concepts and Measurement
Edited by W. Erwin Diewert, John Greenlees, and Charles R. Hulten University of Chicago Press, 2009 Library of Congress HC106.3.C714 2009 | Dewey Decimal 338.528
Although inflation is much feared for its negative effects on the economy, how to measure it is a matter of considerable debate that has important implications for interest rates, monetary supply, and investment and spending decisions. Underlying many of these issues is the concept of the Cost-of-Living Index (COLI) and its controversial role as the methodological foundation for the Consumer Price Index (CPI).
Price Index Concepts and Measurement brings together leading experts to address the many questions involved in conceptualizing and measuring inflation. They evaluate the accuracy of the COLI, a Cost-of-Goods Index, and a variety of other methodological frameworks as the bases for consumer price construction.
A. M. Olevskii, "Fourier Quasicrystals"
Marianna Csörnyei, "Tangents of Curves and Differentiability of Functions"
Zoltán Buczolich, "Convergence of Ergodic Averages for Many Group Rotations"
Krzysztof Chris Ciesielski, David Miller, "A Continuous Tale on Continuous and Separately Continuous Functions"
Pongpol Ruankong, Songkiat Sumetkijakan, "Essential Closures"
Bruce H. Hanson, "Sets of Non-Differentiability for Functions with Finite Lower Scaled Oscillation"
Horst Alzer, "Inequalities for Mean Values in Two Variables"
Robert Menkyna, "On the Differences of Lower Semicontinuous Functions"
S. N. Mukhopadhyay, S. Ray, "Relation Between Lp-Derivates and Peano, Approximate Peano and Borel Derivates of Higher Order"
Igor L. Bloshanskii, Denis A. Grafov, "Sufficient Conditions for Convergence Almost Everywhere of Multiple Trigonometric Fourier Series with Lacunary Sequence of Partial Sums"
Brian S. Thomson, "On VBG Functions and the Denjoy-Khintchine Integral"
Péter Komjáth, "A Certain 2-Coloring of the Reals"
Alexander Kharazishvili, "Absolute Null Subsets of the Plane with Bad Orthogonal Projections"
René E. Castillo, Julio C. Ramos-Fernández, Margot Salas-Brown, "The Essential Norm of Multiplication Operators on Lorentz Sequence Spaces"
John C. Morgan II, "Completion from an Abstract Perspective"
M. Archana, V. Kannan, "Intervals Containing All the Periodic Points"
Ryan M. Berndt, Greg G. Oman, "Turning Automatic Continuity Around: Automatic Homomorphisms"
Dariusz Kosz, "On the Discretization Technique for the Hardy-Littlewood Maximal Operators"
Volodymyr Mykhaylyuk, "On the Mixed Derivatives of a Separately Twice Differentiable Function"
Jimmy Tseng, "Nondense Orbits for Anosov Diffeomorphisms of the 2-Torus"
Balazs Maga, "Accumulation Points of Graphs of Baire-1 and Baire-2 Functions"
Kevin Beanland, Paul D. Humke, Trevor Richards, "On Scottish Book Problem 157"
Emma D'Aniello, T. H. Steele, "A Non Self-Similar Set"
Jan A. Grzesik, "Contour Integration Underlies Fundamental Bernoulli Number Recurrence"
Pier Domenico Lamberti, Giorgio Stefani, "Sobolev Subspaces of Nowhere Bounded Functions"
Oswaldo de Oliveira, "The Implicit Function Theorem When the Partial Jacobian Matrix Is Only Continuous at the Base Point"
Joseph L. Gerver, "A Nice Example of Lebesgue Integration"
Chris Freiling, Richard J. O'Malley, Paul D. Humke, "An Alternate Solution to Scottish Book 157"
Chris Freiling, Richard J. O'Malley, Paul D. Humke, "Approximately Continuous Functions Have Approximate Extrema, a New Proof"
Miroslav Zelený, "Characterizations of σ-Porosity Using Infinite Games"
Alica Miller, "On Various Conditions that Imply Sensitivity of Monoid Actions"
Hajrudin Fejzić, "Divided Differences and Peano Derivatives"
Luisa Di Piazza, "Integrals and Selections of Multifunctions with Values in an Arbitrary Banach Space"
Michael Dymond, Beata Randrianantoanina, Huaqiang Xu, "On Interval Based Generalizations of Absolute Continuity for Functions on Rn"
Changhao Chen, "A Class of Random Cantor Sets"
Silvestru S. Dragomir, "Variance Jensen Type Inequalities for General Lebesgue Integral with Applications"
Carl P. Dettmann, Mrinal Kanti Roychowdhury, "Quantization for Uniform Distributions on Equilateral Triangles"
Vassiliki Farmaki, Andreas Mitropoulos, "The l1-Dichotomy Theorem with Respect to a Coideal"
J. Marshall Ash, Stefan Catoiu, "Directional Differentiability in the Euclidean Plane"
Peter Fletcher, Karel Hrbacek, Vladimir Kanovei, Mikhail G. Katz, Claude Lobry, Sam Sanders, "Approaches to Analysis with Infinitesimals Following Robinson, Nelson, and Others"
Jonathan M. Fraser, Eric J. Olson, James C. Robinson, "Some Results in Support of the Kakeya Conjecture"
Michał Korch, "A Generalized Egorov's Statement for Ideals"
Redouane Sayyad, "The McShane Integral in the Limit"
Grigori A. Karagulyan, "On Exceptional Sets of the Hilbert Transform"
Franklin R. Astudillo-Villaba, René E. Castillo, Julio C. Ramos-Fernández, "Multiplication Operators on the Spaces of Functions of Bounded p-Variation in Wiener's Sense"
Ivan Werner, "On the Carathéodory Approach to the Construction of a Measure"
Vladimir Kanovei, Mikhail Katz, "A Positive Function with Vanishing Lebesgue Integral in Zermelo–Fraenkel Set Theory"
Badreddine Meftah, "On Some Gamidov Integral Inequalities on Time Scales and Applications"
Iwo Labuda, "Measure, Category and Convergent Series"
Hajrudin Fejzić, "A Note on Monotonicity Theorems for Approximately Continuous Functions"
Kathryn E. Hare and Kevin G. Hare, "Local Dimensions of Overlapping Self-Similar Measures"
Henry D. Riely, "A Modification of the Chang-Wilson-Wolff Inequality via the Bellman Function"
S. N. Mukhopadhyay and S. Ray, "Riemann Summability of Trigonometric Series and Riemann Derivatives of Real Functions"
Wahida Kaidouchi, Badreddine Meftah, Meryem Benssaad, and Sarra Ghomrani, "Fractional Hermite-Hadamard Type Integral Inequalities for Functions whose Modulus of the Mixed Derivatives are Co-Ordinated Extended (s1,m1)-(s2,m2)-Preinvex"
Antonio Boccuto, "Hahn-Banach-Type Theorems and Applications to Optimization for Partially Ordered Vector Space-Valued Invariant Operators"
Jaroslav Lukeš and Petr Pošta, "Approximations by Differences of Lower Semicontinuous and Finely Continuous Functions"
Rodrigo López Pouso, "Fourier Method Revised to Solve Partial Differential Equations and Prove Uniqueness at One Stroke"
Savita Bhatnagar, "The Radon Nikodym Property and Multipliers of HK-Integrable Functions"
Steven G. Krantz, "Verifying Differentiability Without Calculating the Derivative"
Ricky F. Rulete and Mhelmar A. Labendia, "A Descriptive Definition of the Backwards Itô-Henstock Integral"
Fábio M. S. Lima, "A Bridge Between the Unit Square and Single Integrals for Real Functions of the Form f(x • y)"
Systems with Small Dissipation
V. B. Braginsky, V. P. Mitrofanov, and V. I. Panov University of Chicago Press, 1985 Library of Congress QC39.B66313 1985 | Dewey Decimal 530.8
Electromagnetic and mechanical oscillators are crucial in such diverse fields as electrical engineering, microwave technology, optical technology, and experimental physics. For example, such oscillators are the key elements in instruments for detecting extremely weak mechanical forces and electromagnetic signals, and they are essential to highly stable standards of time and frequency. The central problem in developing such instruments is to construct oscillators that are as perfectly simple harmonic as possible; the largest obstacle is the oscillator's dissipation and the fluctuating forces associated with it.
This book, first published in Russian in 1981 and updated with new data for this English edition, is a treatise on the sources of dissipation and other defects in mechanical and electromagnetic oscillators and on practical techniques for minimizing such defects. Written by a team of researchers from Moscow State University who are leading experts in the field, the book is a virtual encyclopedia of theoretical formulas, experimental techniques, and practical lore derived from twenty-five years of experience. Intended for the experimenter who wishes to construct near-perfect instrumentation, the book provides information on everything from the role of phonon-phonon scattering as a fundamental source of dissipation to the effectiveness of a thin film of pork fat in reducing the friction between a support wire and a mechanically oscillating sapphire crystal.
The researchers that V. B. Braginsky has led since the mid-1960s are best known in the West for their contributions to the technology of gravitational-wave detection, their experimental search for quarks, their test of the equivalence principle, and their invention of new experimental techniques for high-precision measurement, including "quantum nondemolition measurements." Here, for the first time, they provide a thorough overview of the practical knowledge and experimental methods that have earned them a worldwide reputation for ingenuity, talent, and successful technique.