Catalogue Search | MBRL
8,978 result(s) for "Quantification (science)"
The Search for Mathematical Roots, 1870-1940
2011
While many books have been written about Bertrand Russell's philosophy and some on his logic, I. Grattan-Guinness has written the first comprehensive history of the mathematical background, content, and impact of the mathematical logic and philosophy of mathematics that Russell developed with A. N. Whitehead in their Principia mathematica (1910-1913).
This definitive history of a critical period in mathematics includes detailed accounts of the two principal influences upon Russell around 1900: the set theory of Cantor and the mathematical logic of Peano and his followers. Substantial surveys are provided of many related topics and figures of the late nineteenth century: the foundations of mathematical analysis under Weierstrass; the creation of algebraic logic by De Morgan, Boole, Peirce, Schröder, and Jevons; the contributions of Dedekind and Frege; the phenomenology of Husserl; and the proof theory of Hilbert. The many-sided story of the reception is recorded up to 1940, including the rise of logic in Poland and the impact on Vienna Circle philosophers Carnap and Gödel. A strong American theme runs through the story, beginning with the mathematician E. H. Moore and the philosopher Josiah Royce, and stretching through the emergence of Church and Quine, and the 1930s immigration of Carnap and Gödel.
Grattan-Guinness draws on around fifty manuscript collections, including the Russell Archives, as well as many original reviews. The bibliography comprises around 1,900 items, bringing to light a wealth of primary materials.
Written for mathematicians, logicians, historians, and philosophers--especially those interested in the historical interaction between these disciplines--this authoritative account tells an important story from its most neglected point of view. Whitehead and Russell hoped to show that (much of) mathematics was expressible within their logic; they failed in various ways, but no definitive alternative position emerged then or since.
The measure of civilization
2013
In the last thirty years, there have been fierce debates over how civilizations develop and why the West became so powerful. The Measure of Civilization presents a brand-new way of investigating these questions and provides new tools for assessing the long-term growth of societies. Using a groundbreaking numerical index of social development that compares societies in different times and places, award-winning author Ian Morris sets forth a sweeping examination of Eastern and Western development across 15,000 years since the end of the last ice age. He offers surprising conclusions about when and why the West came to dominate the world and fresh perspectives for thinking about the twenty-first century.
Adapting the United Nations' approach for measuring human development, Morris's index breaks social development into four traits--energy capture per capita, organization, information technology, and war-making capacity--and he uses archaeological, historical, and current government data to quantify patterns. Morris reveals that for 90 percent of the time since the last ice age, the world's most advanced region has been at the western end of Eurasia, but contrary to what many historians once believed, there were roughly 1,200 years--from about 550 to 1750 CE--when an East Asian region was more advanced. Only in the late eighteenth century CE, when northwest Europeans tapped into the energy trapped in fossil fuels, did the West leap ahead.
Resolving some of the biggest debates in global history, The Measure of Civilization puts forth innovative tools for determining past, present, and future economic and social trends.
What is meaning?
2010
The tradition descending from Frege and Russell has typically treated theories of meaning either as theories of meanings (propositions expressed), or as theories of truth conditions. However, propositions of the classical sort don't exist, and truth conditions can't provide all the information required by a theory of meaning. In this book, one of the world's leading philosophers of language offers a way out of this dilemma.
Traditionally conceived, propositions are denizens of a "third realm" beyond mind and matter, "grasped" by mysterious Platonic intuition. As conceived here, they are cognitive-event types in which agents predicate properties and relations of things--in using language, in perception, and in nonlinguistic thought. Because of this, one's acquaintance with, and knowledge of, propositions is acquaintance with, and knowledge of, events of one's cognitive life. This view also solves the problem of "the unity of the proposition" by explaining how propositions can be genuinely representational, and therefore bearers of truth. The problem, in the traditional conception, is that sentences, utterances, and mental states are representational because of the relations they bear to inherently representational Platonic complexes of universals and particulars. Since we have no way of understanding how such structures can be representational, independent of interpretations placed on them by agents, the problem is unsolvable when so conceived. However, when propositions are taken to be cognitive-event types, the order of explanation is reversed and a natural solution emerges. Propositions are representational because they are constitutively related to inherently representational cognitive acts.
Strikingly original, What Is Meaning? is a major advance.
Reference and description
2005,2009,2004
In this book, Scott Soames defends the revolution in philosophy led by Saul Kripke, Hilary Putnam, and David Kaplan against attack from those wishing to revive descriptivism in the philosophy of language, internalism in the philosophy of mind, and conceptualism in the foundations of modality. Soames explains how, in the last twenty-five years, this attack on the anti-descriptivist revolution has coalesced around a technical development called two-dimensional modal logic that seeks to reinterpret the Kripkean categories of the necessary a posteriori and the contingent a priori in ways that drain them of their far-reaching philosophical significance.
Arguing against this reinterpretation, Soames shows how the descriptivist revival has been aided by puzzles and problems ushered in by the anti-descriptivist revolution, as well as by certain errors and missteps in the anti-descriptivist classics themselves. Reference and Description sorts through all this, assesses and consolidates the genuine legacy of Kripke and Kaplan, and launches a thorough and devastating critique of the two-dimensionalist revival of descriptivism. Through it all, Soames attempts to provide the outlines of a lasting, nondescriptivist perspective on meaning, and a nonconceptualist understanding of modality.
The blind spot
In today's unpredictable and chaotic world, we look to science to provide certainty and answers--and often blame it when things go wrong. The Blind Spot reveals why our faith in scientific certainty is a dangerous illusion, and how only by embracing science's inherent ambiguities and paradoxes can we truly appreciate its beauty and harness its potential.
Crackling with insights into our most perplexing contemporary dilemmas, from climate change to the global financial meltdown, this book challenges our most sacredly held beliefs about science, technology, and progress. At the same time, it shows how the secret to better science can be found where we least expect it--in the uncertain, the ambiguous, and the inevitably unpredictable. William Byers explains why the subjective element in scientific inquiry is in fact what makes it so dynamic, and deftly balances the need for certainty and rigor in science with the equally important need for creativity, freedom, and downright wonder. Drawing on an array of fascinating examples--from Wall Street's overreliance on algorithms to provide certainty in uncertain markets, to undecidable problems in mathematics and computer science, to Georg Cantor's paradoxical but true assertion about infinity--Byers demonstrates how we can and must learn from the existence of blind spots in our scientific and mathematical understanding.
The Blind Spot offers an entirely new way of thinking about science, one that highlights its strengths and limitations, its unrealized promise, and, above all, its unavoidable ambiguity. It also points to a more sophisticated approach to the most intractable problems of our time.
The birth of model theory
2004,2009
Löwenheim's theorem reflects a critical point in the history of mathematical logic, for it marks the birth of model theory--that is, the part of logic that concerns the relationship between formal theories and their models. However, while the original proofs of other, comparably significant theorems are well understood, this is not the case with Löwenheim's theorem. For example, the very result that scholars attribute to Löwenheim today is not the one that Skolem--a logician raised in the algebraic tradition, like Löwenheim--appears to have attributed to him. In The Birth of Model Theory, Calixto Badesa provides both the first sustained, book-length analysis of Löwenheim's proof and a detailed description of the theoretical framework--and, in particular, of the algebraic tradition--that made the theorem possible.
Badesa's three main conclusions amount to a completely new interpretation of the proof, one that sharply contradicts the core of modern scholarship on the topic. First, Löwenheim did not use an infinitary language to prove his theorem; second, the functional interpretation of Löwenheim's normal form is anachronistic, and inappropriate for reconstructing the proof; and third, Löwenheim did not aim to prove the theorem's weakest version but the stronger version Skolem attributed to him. This book will be of considerable interest to historians of logic, logicians, philosophers of logic, and philosophers of mathematics.
Comparison between manta trawl and in situ pump filtration methods, and guidance for visual identification of microplastics in surface waters
by Karlsson, Therese M.; Hassellöv, Martin; Kärrman, Anna
in abundance; Aquatic Pollution; Atmospheric Protection/Air Quality Control/Air Pollution
2020
Owing to the development and adoption of a variety of methods for sampling and identifying microplastics, there are now data showing the presence of microplastics in surface waters from all over the world. The differences between the methods, however, hamper comparisons, and to date, most studies are qualitative rather than quantitative. In order to allow for a quantitative comparison of microplastic abundance, it is crucial to understand the differences between sampling methods. Therefore, a manta trawl and an in situ filtering pump were compared during realistic, but controlled, field tests. Identical microplastic analyses of all replicates allowed the differences between the methods with respect to (1) precision, (2) concentrations, and (3) composition to be assessed. The results show that the pump gave higher accuracy with respect to volume than the trawl. The trawl, however, sampled higher concentrations, which appeared to be due to a more efficient sampling of particles on the sea surface microlayer, such as expanded polystyrene and air-filled microspheres. The trawl also sampled a higher volume, which decreased statistical counting uncertainties. A key finding in this study was that, regardless of sampling method, it is critical that a sufficiently high volume is sampled to provide enough particles for statistical evaluation. Due to the patchiness of this type of contaminant, our data indicate that a minimum of 26 particles per sample should be recorded to allow for concentration comparisons and to avoid false null values. The necessary number of replicates to detect temporal or spatial differences is also discussed. For compositional differences and size distributions, even higher particle counts would be necessary. Quantitative measurements and comparisons would also require an unbiased approach towards both visual and spectroscopic identification. To facilitate the development of such methods, a visual protocol that can be further developed to fit different needs is introduced and discussed. Some of the challenges encountered while using FTIR microspectroscopic particle identification are also critically discussed in relation to specific compositions found.
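The abstract's point about recording enough particles per sample rests on counting statistics. The sketch below is illustrative only: it assumes simple Poisson counting, whereas the paper's 26-particle threshold comes from the authors' own analysis of patchily distributed contaminants, not from this code.

```python
import math

def poisson_relative_uncertainty(n_particles: int) -> float:
    """One-sigma relative uncertainty of a Poisson-distributed count: sqrt(N)/N."""
    return 1.0 / math.sqrt(n_particles)

def particles_needed(target_relative_uncertainty: float) -> int:
    """Smallest count whose Poisson relative uncertainty is at or below the target."""
    return math.ceil(1.0 / target_relative_uncertainty ** 2)

if __name__ == "__main__":
    for n in (5, 26, 100):
        print(f"{n:>4} particles -> ~{poisson_relative_uncertainty(n):.0%} relative uncertainty")
    # A 20% target implies at least 25 particles under this simple model.
    print("particles needed for <=20% uncertainty:", particles_needed(0.20))
```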
Journal Article
Deep learning for post-processing ensemble weather forecasts
2021
Quantifying uncertainty in weather forecasts is critical, especially for predicting extreme weather events. This is typically accomplished with ensemble prediction systems, which consist of many perturbed numerical weather simulations, or trajectories, run in parallel. These systems are associated with a high computational cost and often involve statistical post-processing steps to inexpensively improve their raw prediction qualities. We propose a mixed model that uses only a subset of the original weather trajectories combined with a post-processing step using deep neural networks. These enable the model to account for non-linear relationships that are not captured by current numerical models or post-processing methods. Applied to the global data, our mixed models achieve a relative improvement in ensemble forecast skill (CRPS) of over 14%. Furthermore, we demonstrate that the improvement is larger for extreme weather events on select case studies. We also show that our post-processing can use fewer trajectories to achieve comparable results to the full ensemble. By using fewer trajectories, the computational costs of an ensemble prediction system can be reduced, allowing it to run at higher resolution and produce more accurate forecasts. This article is part of the theme issue ‘Machine learning for weather and climate modelling’.
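The abstract reports forecast skill in terms of the CRPS. As a point of reference only (this is not the authors' model or code), the sketch below computes the standard ensemble form of the CRPS for a single forecast-observation pair; the deep-learning post-processing step itself is not reproduced here.

```python
import numpy as np

def ensemble_crps(members: np.ndarray, observation: float) -> float:
    """Continuous Ranked Probability Score of an ensemble forecast.

    Kernel form: CRPS = E|X - y| - 0.5 * E|X - X'|, where X and X' are ensemble
    members and y is the observation. Lower values indicate better skill.
    """
    members = np.asarray(members, dtype=float)
    err_term = np.mean(np.abs(members - observation))
    spread_term = 0.5 * np.mean(np.abs(members[:, None] - members[None, :]))
    return err_term - spread_term

# Toy example: a five-member temperature ensemble (degrees C) vs. the observed value.
ensemble = np.array([21.3, 22.1, 20.8, 23.0, 21.7])
print(f"CRPS = {ensemble_crps(ensemble, observation=22.4):.3f}")
```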
Journal Article
Cytokines: From Clinical Significance to Quantification
by Liu, Chao; Kalantar‐Zadeh, Kourosh; George, Jacob
in Biomarkers - blood; Biosensors; Clinical significance
2021
Cytokines are critical mediators that oversee and regulate immune and inflammatory responses via complex networks and serve as biomarkers for many diseases. Quantification of cytokines has significant value in both clinical medicine and biology, as their levels provide insights into physiological and pathological processes and can be used to aid diagnosis and treatment. Cytokines and their clinical significance are introduced from the perspective of their pro‐ and anti‐inflammatory effects. Factors affecting cytokine quantification in biological fluids, native levels in different body fluids, sample processing and storage conditions, sensitivity to freeze‐thaw, and soluble cytokine receptors are discussed. In addition, recent advances in in vitro and in vivo assays, biosensors based on different signal outputs, and intracellular to extracellular protein expression are summarized. Various quantification platforms for high‐sensitivity and reliable measurement of cytokines in different scenarios are discussed, and commercially available cytokine assays are compared. Challenges in the development and advancement of technologies for cytokine quantification, aimed at achieving real‐time multiplex cytokine analysis for point‐of‐care situations applicable to both biomedical research and clinical practice, are also discussed. Cytokines are important cellular signaling molecules and immune system mediators. Abnormal cytokine levels may cause cytokine storms and disease. Consequently, quantification of cytokines is valuable for disease diagnosis and therapy. The clinical significance of cytokines, factors affecting cytokine quantification, and advances in cytokine detection are summarized, providing a perspective on real‐time quantification of multiplex cytokines in the clinic.
Journal Article
Quantifying Streambed Grain Size, Uncertainty, and Hydrobiogeochemical Parameters Using Machine Learning Model YOLO
by Chen, Rongyao; Renteria, Lupita; Goldman, Amy E.
in Accuracy; Artificial intelligence; Biogeochemistry
2024
Streambed grain sizes control river hydro‐biogeochemical (HBGC) processes and functions. However, measuring their quantities, distributions, and uncertainties is challenging due to the diversity and heterogeneity of natural streams. This work presents a photo‐driven, artificial intelligence (AI)‐enabled, and theory‐based workflow for extracting the quantities, distributions, and uncertainties of streambed grain sizes from photos. Specifically, we first trained You Only Look Once, an object detection AI, using 11,977 grain labels from 36 photos collected from nine different stream environments. We demonstrated its accuracy with a coefficient of determination of 0.98, a Nash–Sutcliffe efficiency of 0.98, and a mean absolute relative error of 6.65% in predicting the median grain size of 20 ground‐truth photos representing nine typical stream environments. The AI is then used to extract the grain size distributions and determine their characteristic grain sizes, including the 10th, 50th, 60th, and 84th percentiles, for 1,999 photos taken at 66 sites within a watershed in the Northwest US. The results indicate that the 10th, median, 60th, and 84th percentiles of the grain sizes follow log‐normal distributions, with most likely values of 2.49, 6.62, 7.68, and 10.78 cm, respectively. The average uncertainties associated with these values are 9.70%, 7.33%, 9.27%, and 11.11%, respectively. These data allow for the computation of the quantities, distributions, and uncertainties of streambed HBGC parameters, including Manning's coefficient, Darcy‐Weisbach friction factor, top layer interstitial velocity magnitude, and nitrate uptake velocity. Additionally, major sources of uncertainty in grain sizes and their impact on HBGC parameters are examined.
Plain Language Summary: Streambed grain sizes control river hydro‐biogeochemical function by modulating the resistance, speed of water exchange, and nutrient transport at the water‐sediment interface. Consequently, quantifying grain sizes and size‐dependent hydro‐biogeochemical parameters is critical for predicting a river's function. In natural streams, however, measuring these sizes and parameters is challenging because grain sizes vary from millimeters to a few meters, change from small creeks to big streams, and can be concealed by complex non‐grain materials such as water, ice, mud, and grasses. All of these factors make size measurement a time‐consuming and highly uncertain task. We address these challenges by demonstrating a workflow that combines computer vision artificial intelligence (AI), smartphone photos, and new uncertainty quantification theories. The AI performs well across various sizes, locations, and stream environments, as indicated by an accuracy metric of 0.98. We apply the AI to extract the grain sizes and their characteristic percentiles for 1,999 photos. These characteristic grain sizes are then input into existing and our new theories to derive the quantities, distributions, and uncertainties of hydrobiogeochemical parameters. The high accuracy of the AI and the success of extracting grain sizes and hydro‐biogeochemical parameters demonstrate the potential to advance river science with computer vision AI and mobile devices.
Key Points:
- Stream sediments bigger than 44 microns can be detected from smartphone photos by You Only Look Once with a Nash–Sutcliffe efficiency of 0.98
- Quantities, distributions, and uncertainties of streambed grain sizes can be determined from photos
- Impact of grain size uncertainty on hydrobiogeochemical parameters is examined
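To give a concrete sense of the percentile step described in the abstract, the sketch below is illustrative only: the grain diameters are made up rather than taken from YOLO detections, and the Strickler-type relation for Manning's coefficient is one common empirical choice, not necessarily the formulation used in the paper.

```python
import numpy as np

# Hypothetical grain diameters (cm) detected in one streambed photo; in the paper
# these come from YOLO bounding boxes, which this sketch does not reproduce.
detected_diameters_cm = np.array([1.2, 2.5, 3.1, 4.8, 6.6, 7.0, 7.7, 9.4, 10.8, 14.2])

# Characteristic grain sizes named in the abstract: 10th, 50th, 60th, 84th percentiles.
d10, d50, d60, d84 = np.percentile(detected_diameters_cm, [10, 50, 60, 84])
print(f"d10={d10:.2f} cm, d50={d50:.2f} cm, d60={d60:.2f} cm, d84={d84:.2f} cm")

# A common Strickler-type estimate linking median grain size (in meters) to
# Manning's roughness coefficient; the paper may use a different relation.
manning_n = (d50 / 100.0) ** (1.0 / 6.0) / 21.1
print(f"Manning's n (Strickler-type estimate) ~ {manning_n:.3f}")
```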
Journal Article