Catalogue Search | MBRL
Explore the vast range of titles available.

164 result(s) for "Data sets History."
The Chinese computer : a global history of the information age
"Exploration of the largely unknown history of Chinese-language computing systems, accessible to an audience unfamiliar with the Chinese language or the technical workings of personal computers" -- Provided by publisher.
Life in 3-D: Life history strategies in tunas, mackerels and bonitos
by Juan-Jordá, María José; Mosqueira, Iago; Freire, Juan
in Aquaculture; Biomedical and Life Sciences; Body size
2013
MJJJ was supported in part by EU Marie Curie Early Stage Research Training project METAOCEANS, MEST-CT-2005-019678 and a Maria Barbeito Scholarship from Xunta de Galicia, Spain. NKD was supported by the Natural Environment Research Council of Canada.
Journal Article
How data happened : a history from the age of reason to the age of algorithms
"From facial recognition--capable of checking people into flights or identifying undocumented residents--to automated decision systems that inform who gets loans and who receives bail, each of us moves through a world determined by data-empowered algorithms. But these technologies didn't just appear: they are part of a history that goes back centuries, from the census enshrined in the US Constitution to the birth of eugenics in Victorian Britain to the development of Google search. Expanding on the popular course they created at Columbia University, Chris Wiggins and Matthew L. Jones illuminate the ways in which data has long been used as a tool and a weapon in arguing for what is true, as well as a means of rearranging or defending power. They explore how data was created and curated, as well as how new mathematical and computational techniques developed to contend with that data serve to shape people, ideas, society, military operations, and economies. Although technology and mathematics are at its heart, the story of data ultimately concerns an unstable game among states, corporations, and people. How were new technical and scientific capabilities developed; who supported, advanced, or funded these capabilities or transitions; and how did they change who could do what, from what, and to whom? Wiggins and Jones focus on these questions as they trace data's historical arc, and look to the future. By understanding the trajectory of data--where it has been and where it might yet go--Wiggins and Jones argue that we can understand how to bend it to ends that we collectively choose, with intentionality and purpose." -- Publisher marketing.
Probabilistic assessment of sea level during the last interglacial stage
by Oppenheimer, Michael; Simons, Frederik J.; Mitrovica, Jerry X.
in Algorithms; Analysis; Antarctic Regions
2009
With polar temperatures ∼3–5 °C warmer than today, the last interglacial stage (∼125 kyr ago) serves as a partial analogue for 1–2 °C global warming scenarios. Geological records from several sites indicate that local sea levels during the last interglacial were higher than today, but because local sea levels differ from global sea level, accurately reconstructing past global sea level requires an integrated analysis of globally distributed data sets. Here we present an extensive compilation of local sea level indicators and a statistical approach for estimating global sea level, local sea levels, ice sheet volumes and their associated uncertainties. We find a 95% probability that global sea level peaked at least 6.6 m higher than today during the last interglacial; it is likely (67% probability) to have exceeded 8.0 m but is unlikely (33% probability) to have exceeded 9.4 m. When global sea level was close to its current level (≥−10 m), the millennial average rate of global sea level rise is very likely to have exceeded 5.6 m kyr⁻¹ but is unlikely to have exceeded 9.2 m kyr⁻¹. Our analysis extends previous last interglacial sea level studies by integrating literature observations within a probabilistic framework that accounts for the physics of sea level change. The results highlight the long-term vulnerability of ice sheets to even relatively low levels of sustained global warming.
A model of rising sea levels
Sea levels at the last interglacial, about 125,000 years ago, were higher and polar temperatures up to 5 °C warmer than today, so the period is seen as a partial analogue for what could happen in the event of anthropogenic warming. Kopp et al. assemble a global database of local sea level rise indicators and use a statistical treatment to estimate global sea level rise during the last interglacial. They find that global sea level was probably 8 to 9.4 metres above present levels, with the rate of sea level rise exceeding 50 cm per century. This suggests that today's ice sheets would be vulnerable to relatively low levels of global warming.
Sea levels during the last interglacial stage (about 125 kyr ago) are known to have been higher than today, and may serve as a partial analogue for anthropogenic warming scenarios. However, because local sea levels differ from global sea level, accurately reconstructing past global sea level requires an integrated analysis of globally distributed data sets. An extensive compilation of local sea level indicators and a statistical approach are now used to estimate global sea level during the last interglacial.
Journal Article
Model-Based Purchase Predictions for Large Assortments
2016
An accurate prediction of what a customer will purchase next is of paramount importance to successful online retailing. In practice, customer purchase history data is readily available to make such predictions, sometimes complemented with customer characteristics. Given the large product assortments maintained by online retailers, scalability of the prediction method is just as important as its accuracy. We study two classes of models that use such data to predict what a customer will buy next, i.e., a novel approach that uses latent Dirichlet allocation (LDA), and mixtures of Dirichlet-Multinomials (MDM). A key benefit of a model-based approach is the potential to accommodate observed customer heterogeneity through the inclusion of predictor variables. We show that LDA can be extended in this direction while retaining its scalability. We apply the models to purchase data from an online retailer and contrast their predictive performance with that of a collaborative filter and a discrete choice model. Both LDA and MDM outperform the other methods. Moreover, LDA attains performance similar to that of MDM while being far more scalable, rendering it a promising approach to purchase prediction in large product assortments.
Data, as supplemental material, are available at http://dx.doi.org/10.1287/mksc.2016.0985.
Journal Article
“Big Data” in Economic History
by Gutmann, Myron P.; Merchant, Emily Klancher; Roberts, Evan
in 19th century; Archives & records; Archivists
2018
Big data is an exciting prospect for the field of economic history, which has long depended on the acquisition, keying, and cleaning of scarce numerical information about the past. This article examines two areas in which economic historians are already using big data – population and environment – discussing ways in which increased frequency of observation, denser samples, and smaller geographic units allow us to analyze the past with greater precision and often to track individuals, places, and phenomena across time. We also explore promising new sources of big data: organically created economic data, high resolution images, and textual corpora.
Journal Article
PON-P2: Prediction Method for Fast and Reliable Identification of Harmful Variants
by Vihinen, Mauno; Niroula, Abhishek; Urolagin, Siddhaling
in Amino Acid Substitution; Amino acids; Annotations
2015
More reliable and faster prediction methods are needed to interpret the enormous amounts of data generated by sequencing and genome projects. We have developed a new computational tool, PON-P2, for classification of amino acid substitutions in human proteins. The method is a machine learning-based classifier and groups the variants into pathogenic, neutral and unknown classes on the basis of a random forest probability score. PON-P2 is trained using pathogenic and neutral variants obtained from VariBench, a database for benchmark variation datasets. PON-P2 utilizes information about evolutionary conservation of sequences, physical and biochemical properties of amino acids, GO annotations and, if available, functional annotations of variation sites. Extensive feature selection was performed to identify 8 informative features among altogether 622 features. PON-P2 consistently showed superior performance in comparison to existing state-of-the-art tools. In the 10-fold cross-validation test, its accuracy and MCC are 0.90 and 0.80, respectively, and in the independent test, they are 0.86 and 0.71, respectively. The coverage of PON-P2 is 61.7% in the 10-fold cross-validation and 62.1% in the test dataset. PON-P2 is a powerful tool for screening harmful variants and for ranking and prioritizing experimental characterization. It is very fast, making it capable of analyzing large variant datasets. PON-P2 is freely available at http://structure.bmc.lu.se/PON-P2/.
Journal Article
Non-gravitational acceleration in the trajectory of 1I/2017 U1 (‘Oumuamua)
by Farnocchia, Davide; Ebeling, Harald; Chambers, Kenneth C.
in 639/33/34/4121; 639/33/445/848; Acceleration
2018
‘Oumuamua (1I/2017 U1) is the first known object of interstellar origin to have entered the Solar System on an unbound and hyperbolic trajectory with respect to the Sun [1]. Various physical observations collected during its visit to the Solar System showed that it has an unusually elongated shape and a tumbling rotation state [1–4] and that the physical properties of its surface resemble those of cometary nuclei [5,6], even though it showed no evidence of cometary activity [1,5,7]. The motion of all celestial bodies is governed mostly by gravity, but the trajectories of comets can also be affected by non-gravitational forces due to cometary outgassing [8]. Because non-gravitational accelerations are at least three to four orders of magnitude weaker than gravitational acceleration, the detection of any deviation from a purely gravity-driven trajectory requires high-quality astrometry over a long arc. As a result, non-gravitational effects have been measured on only a limited subset of the small-body population [9]. Here we report the detection, at 30σ significance, of non-gravitational acceleration in the motion of ‘Oumuamua. We analyse imaging data from extensive observations by ground-based and orbiting facilities. This analysis rules out systematic biases and shows that all astrometric data can be described once a non-gravitational component representing a heliocentric radial acceleration proportional to r⁻² or r⁻¹ (where r is the heliocentric distance) is included in the model. After ruling out solar-radiation pressure, drag- and friction-like forces, interaction with solar wind for a highly magnetized object, and geometric effects originating from ‘Oumuamua potentially being composed of several spatially separated bodies or having a pronounced offset between its photocentre and centre of mass, we find comet-like outgassing to be a physically viable explanation, provided that ‘Oumuamua has thermal properties similar to comets.
‘Oumuamua—the first known interstellar object to have entered the Solar System—is probably a comet, albeit with unusual dust and chemical properties owing to its origin in a distant solar system.
Journal Article
Building Grit: The Longitudinal Pathways between Mindset, Commitment, Grit, and Academic Outcomes
by Tang, Xin; Wang, Ming-Te; Guo, Jiesi
in Academic achievement; Academic Persistence; Adolescents
2019
Despite academics’ enthusiasm about the concept of grit (defined as consistency of interest and perseverance of effort), its benefit for academic achievement has recently been challenged. Drawing from a longitudinal sample (N = 2018; 55.3% female; sixth–ninth grades) from Finland, this study first aimed to investigate and replicate the association between grit and achievement outcomes (i.e., academic achievement and engagement). Further, the present study examined whether growth mindset and goal commitment impacted grit and whether grit acted as a mediator between growth mindset, goal commitment, and achievement outcomes. The results showed that the perseverance facet of grit in the eighth grade was associated with school achievement and engagement in the ninth grade, after controlling for students’ conscientiousness, academic persistence, prior achievement and engagement, gender and SES, although the effect on engagement was stronger than on achievement. In addition, grit was predicted by goal commitment in the sixth grade, but not by growth mindset in the sixth grade. Finally, the perseverance of effort (not the consistency of interest) mediated the effect of goal commitment on engagement. These findings suggest that grit is associated with increased engagement and academic achievement, and practitioners who wish to improve the grit of adolescents may encourage goal commitment more than growth mindset.
Journal Article