Catalogue Search | MBRL
Search Results
Explore the vast range of titles available.
665,958 result(s) for "Technology - statistics"
Automatic hoof-on and -off detection in horses using hoof-mounted inertial measurement unit sensors
2020
For gait classification, hoof-on and hoof-off events are fundamental locomotion characteristics of interest. These events can be measured with inertial measurement units (IMUs), which measure acceleration and angular velocity in three directions. The aim of this study was to present two algorithms for the automatic detection of hoof events from the acceleration and angular velocity signals measured by hoof-mounted IMUs at walk and trot on a hard surface. Seven Warmblood horses were equipped with two wireless IMUs, which were attached to the lateral wall of the right front (RF) and right hind (RH) hooves. Horses were walked and trotted on a lead over a force plate for internal validation. The agreement of the acceleration-based and angular velocity-based algorithms with the force plate was evaluated by Bland-Altman analysis and linear mixed model analysis, for hoof-on and hoof-off detection and for each algorithm separately. For hoof-on detection, the angular velocity algorithm was the most accurate, with an accuracy between 2.39 and 12.22 ms and a precision of around 13.80 ms, depending on gait and hoof. For hoof-off detection, the acceleration algorithm was the most accurate, with an accuracy of 3.20 ms and a precision of 6.39 ms, independent of gait and hoof. These algorithms look highly promising for gait classification purposes, although their applicability should be investigated under different circumstances, such as different surfaces and different hoof trimming conditions.
Journal Article
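The two detection algorithms are not spelled out in the abstract above, so the following is only a minimal sketch of one plausible approach: thresholding the angular-velocity magnitude from a hoof-mounted IMU to find stance and swing transitions. The function name, threshold, and sampling assumptions are illustrative placeholders, not the authors' method.

```python
import numpy as np

def detect_hoof_events(gyro_magnitude, fs, threshold=50.0, min_stance_s=0.1):
    """Hypothetical threshold-based detection of hoof-on/hoof-off events.

    gyro_magnitude : 1-D array, angular-velocity magnitude (deg/s)
    fs             : sampling frequency (Hz)
    threshold      : below this the hoof is assumed stationary (stance phase)
    min_stance_s   : discard stance phases shorter than this (s)
    """
    stationary = gyro_magnitude < threshold            # True during stance
    edges = np.diff(stationary.astype(int))
    hoof_on = np.where(edges == 1)[0] + 1              # swing -> stance transitions
    hoof_off = np.where(edges == -1)[0] + 1            # stance -> swing transitions

    min_len = int(min_stance_s * fs)
    events = []
    for on in hoof_on:
        later_offs = hoof_off[hoof_off > on]
        if later_offs.size and (later_offs[0] - on) >= min_len:
            events.append((on / fs, later_offs[0] / fs))  # (hoof-on, hoof-off) times in s
    return events
```

In practice the signals would first be low-pass filtered and the threshold tuned per gait and surface; the paper's force-plate validation is what such choices would be checked against.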
Thinking through data : how outliers, aggregates, and patterns shape perception
by
Bak Herrie, Maja, author
in
Statistics and aesthetics
,
Aesthetics, Modern -- 21st century
,
Art and technology
2025
\"We encounter digital data processing on a range of platforms and in a multitude of contexts today: in the predictive algorithms of the financial sector, in drones, insurance, and risk management, in smart cities, biometrics, medicine, and more. This fascinating book explores the historical context of the current data-driven paradigm and explains how elusive yet crucial statistical concepts such as outliers, aggregates, and patterns form how we sense and make sense of data. From the 16th century's embodied measurements of the foot, through the blurred facial features of L'Homme Moyen, to the image aggregates of today's security systems, the examples collected in this book illustrate the central role of aesthetics throughout the history of statistical knowledge production. Taking its point of departure in analyses and discussions of contemporary artistic experiments by Rossella Biscotti, Stéphanie Solinas, and Adam Broomberg and Oliver Chanarin, the book broadens our understanding of the structures of knowledge and methods in statistical computation beyond optimistic narratives of calculative power. Venturing out into the tails of the distributions--to the systemically overlooked and excluded--this book challenges us to embrace an alternative view of room of modern data processing\"-- Provided by publisher.
Papers and patents are becoming less disruptive over time
2023
Theories of scientific and technological change view discovery and invention as endogenous processes [1, 2], wherein previous accumulated knowledge enables future progress by allowing researchers to, in Newton's words, 'stand on the shoulders of giants' [3–7]. Recent decades have witnessed exponential growth in the volume of new scientific and technological knowledge, thereby creating conditions that should be ripe for major advances [8, 9]. Yet contrary to this view, studies suggest that progress is slowing in several major fields [10, 11]. Here, we analyse these claims at scale across six decades, using data on 45 million papers and 3.9 million patents from six large-scale datasets, together with a new quantitative metric—the CD index [12]—that characterizes how papers and patents change networks of citations in science and technology. We find that papers and patents are increasingly less likely to break with the past in ways that push science and technology in new directions. This pattern holds universally across fields and is robust across multiple different citation- and text-based metrics [1, 13–17]. Subsequently, we link this decline in disruptiveness to a narrowing in the use of previous knowledge, allowing us to reconcile the patterns we observe with the 'shoulders of giants' view. We find that the observed declines are unlikely to be driven by changes in the quality of published science, citation practices or field-specific factors. Overall, our results suggest that slowing rates of disruption may reflect a fundamental shift in the nature of science and technology.
A decline in disruptive science and technology over time is reported, representing a substantive shift in science and technology, which is attributed in part to the reliance on a narrower set of existing knowledge.
Journal Article
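For readers unfamiliar with the CD index cited above (reference 12), the sketch below follows the commonly published definition: a later paper that cites the focal work but none of its references pulls the index toward +1 (disruptive), while one that cites both pulls it toward -1 (consolidating). The data structures and the handling of citation windows are simplified assumptions here, not the paper's pipeline.

```python
def cd_index(focal_id, focal_refs, later_papers):
    """Minimal sketch of the CD index for a single focal paper.

    focal_id     : identifier of the focal paper
    focal_refs   : set of ids the focal paper cites (its references)
    later_papers : dict {paper_id: set of cited ids}, restricted to papers
                   published after the focal paper (e.g. within a fixed window)
    """
    terms = []
    for cited in later_papers.values():
        f = 1 if focal_id in cited else 0        # cites the focal paper
        b = 1 if cited & focal_refs else 0       # cites any of its references
        if f or b:                               # only papers linked to the focal work count
            terms.append(-2 * f * b + f)         # +1 disruptive, -1 consolidating, 0 otherwise
    return sum(terms) / len(terms) if terms else None

# toy example: two later papers cite only the focal work, one cites it plus a reference
print(cd_index("F", {"r1", "r2"}, {"a": {"F"}, "b": {"F"}, "c": {"F", "r1"}}))  # 1/3
```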
Arab knowledge report ...
by
Mohammed bin Rashid Al Maktoum Foundation (مؤسسة محمد بن راشد آل مكتوم), author
,
United Nations Development Programme. Regional Bureau for Arab States, author
in
Information society -- United Arab Emirates -- Statistics -- Periodicals
,
Quality of life -- United Arab Emirates -- Statistics -- Periodicals
,
Information technology -- United Arab Emirates -- Statistics -- Periodicals
Periodical
Sensing Coverage Prediction for Wireless Sensor Networks in Shadowed and Multipath Environment
by
Lobiyal, D. K.
,
Kumar, Sushil
in
Comparative studies
,
Computer Communication Networks - statistics & numerical data
,
Computer science
2013
Sensing coverage in wireless sensor networks is a measure of quality of service (QoS): coverage refers to how well a sensing field is monitored or tracked by the sensors. The aim of the paper is to obtain an a priori estimate of the number of sensors that must be deployed in a harsh environment to achieve the desired coverage. We propose a new sensing channel model that considers the combined impact of shadowing and multipath effects. A mathematical model for calculating the coverage probability in the presence of multipath fading combined with shadowing is derived, based on received signal strength (RSS). Further, the coverage probability derivations obtained using Rayleigh fading and lognormal shadowing are validated by node deployment using a Poisson distribution. A comparative study between our proposed sensing channel model and existing sensing models for network coverage is also presented. The proposed sensing model is better suited to realistic environments, since it determines the optimum number of sensors required for the desired coverage under fading conditions.
Journal Article
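The closed-form coverage-probability derivation described in the abstract above is not reproduced here; the snippet below is only a Monte Carlo sketch of the same idea, assuming log-distance path loss with lognormal shadowing and Rayleigh multipath fading, an RSS detection threshold, and Poisson node deployment. All parameter values are illustrative.

```python
import numpy as np

rng = np.random.default_rng(0)

def coverage_probability(density, area, p_tx_dbm=0.0, path_loss_exp=4.0,
                         sigma_db=6.0, rss_threshold_dbm=-60.0, trials=10_000):
    """Estimate the probability that a random target point is sensed by at
    least one node, under shadowing plus multipath fading (illustrative model)."""
    side = np.sqrt(area)
    covered = 0
    for _ in range(trials):
        k = rng.poisson(density * area)                 # Poisson node deployment
        if k == 0:
            continue
        nodes = rng.uniform(0, side, size=(k, 2))
        target = rng.uniform(0, side, size=2)
        d = np.linalg.norm(nodes - target, axis=1).clip(min=1.0)
        path_loss_db = 10 * path_loss_exp * np.log10(d)          # log-distance path loss
        shadowing_db = rng.normal(0.0, sigma_db, size=k)         # lognormal shadowing
        fading_db = 10 * np.log10(rng.exponential(1.0, size=k))  # Rayleigh power fading
        rss_dbm = p_tx_dbm - path_loss_db + shadowing_db + fading_db
        if np.any(rss_dbm >= rss_threshold_dbm):
            covered += 1
    return covered / trials

# e.g. fraction of target points sensed for 0.001 nodes/m^2 over a 100 m x 100 m field
print(coverage_probability(density=0.001, area=100 * 100))
```

Sweeping `density` until the estimate reaches a target coverage gives the kind of a priori node-count estimate the paper aims for, here by simulation rather than by the paper's analytical expressions.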
Electromagnetics, control and robotics : a problems & solutions approach
This book covers a variety of problems, and offers solutions to some, in: statistical state and parameter estimation in nonlinear stochastic dynamical systems, in both the classical and quantum scenarios; propagation of electromagnetic waves in a plasma as described by the Boltzmann kinetic transport equation; and classical and quantum general relativity. It will be of use to engineering undergraduate students interested in analysing the motion of robots subject to random perturbations, and also to research scientists working in quantum filtering.
Gender differences in individual variation in academic grades fail to fit expected patterns for STEM
2018
Fewer women than men pursue careers in science, technology, engineering and mathematics (STEM), despite girls outperforming boys at school in the relevant subjects. According to the ‘variability hypothesis’, this over-representation of males is driven by gender differences in variance; greater male variability leads to greater numbers of men who exceed the performance threshold. Here, we use recent meta-analytic advances to compare gender differences in academic grades from over 1.6 million students. In line with previous studies we find strong evidence for lower variation among girls than boys, and of higher average grades for girls. However, the gender differences in both mean and variance of grades are smaller in STEM than non-STEM subjects, suggesting that greater variability is insufficient to explain male over-representation in STEM. Simulations of these differences suggest the top 10% of a class contains equal numbers of girls and boys in STEM, but more girls in non-STEM subjects.
Men are over-represented in the STEM (science, technology, engineering and mathematics) workforce even though girls outperform boys in these subjects at school. Here, the authors cast doubt on one leading explanation for this paradox, the ‘variability hypothesis’.
Journal Article
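The simulation mentioned at the end of the abstract above can be illustrated with a toy version: draw grades for two groups that differ in mean and variance and count who lands in the top decile. The effect sizes below are placeholders, not the paper's meta-analytic estimates.

```python
import numpy as np

rng = np.random.default_rng(1)

def share_of_girls_in_top(frac_top, mean_gap=0.2, variance_ratio=0.9,
                          n_per_group=1_000_000):
    """Fraction of girls among students above the top `frac_top` cutoff, given a
    standardized mean advantage for girls and a girls-to-boys variance ratio."""
    boys = rng.normal(0.0, 1.0, n_per_group)
    girls = rng.normal(mean_gap, np.sqrt(variance_ratio), n_per_group)
    cutoff = np.quantile(np.concatenate([boys, girls]), 1.0 - frac_top)
    top_girls = np.sum(girls >= cutoff)
    top_boys = np.sum(boys >= cutoff)
    return top_girls / (top_girls + top_boys)

# smaller mean and variance gaps (illustrative of the STEM pattern) vs larger ones (non-STEM)
print(share_of_girls_in_top(0.10, mean_gap=0.1, variance_ratio=0.95))  # close to parity
print(share_of_girls_in_top(0.10, mean_gap=0.3, variance_ratio=0.85))  # more girls than boys
```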
Regularization, optimization, kernels, and support vector machines
\"Obtaining reliable models from given data is becoming increasingly important in a wide range of different applications fields including the prediction of energy consumption, complex networks, environmental modelling, biomedicine, bioinformatics, finance, process modelling, image and signal processing, brain-computer interfaces, and others. In data-driven modelling approaches one has witnessed considerable progress in the understanding of estimating flexible nonlinear models, learning and generalization aspects, optimization methods, and structured modelling. One area of high impact both in theory and applications is kernel methods and support vector machines. Optimization problems, learning, and representations of models are key ingredients in these methods. On the other hand, considerable progress has also been made on regularization of parametric models, including methods for compressed sensing and sparsity, where convex optimization plays an important role. At the international workshop ROKS 2013 Leuven, 1 July 8-10, 2013, researchers from diverse fields were meeting on the theory and applications of regularization, optimization, kernels, and support vector machines. At this occasion the present book has been edited as a follow-up to this event, with a variety of invited contributions from presenters and scientific committee members. It is a collection of recent progress and advanced contributions on these topics, addressing methods including ...\"-- Provided by publisher.
Complex economic activities concentrate in large cities
by
Jara-Figueroa, Cristian
,
Hidalgo, César A.
,
Petralia, Sergio G.
in
4014/159
,
4014/2808
,
4014/4001
2020
Human activities, such as research, innovation and industry, concentrate disproportionately in large cities. The ten most innovative cities in the United States account for 23% of the national population, but for 48% of its patents and 33% of its gross domestic product. But why has human activity become increasingly concentrated? Here we use data on scientific papers, patents, employment and gross domestic product, for 353 metropolitan areas in the United States, to show that the spatial concentration of productive activities increases with their complexity. Complex economic activities, such as biotechnology, neurobiology and semiconductors, concentrate disproportionately in a few large cities compared to less complex activities, such as apparel or paper manufacturing. We use multiple proxies to measure the complexity of activities, finding that complexity explains from 40% to 80% of the variance in urban concentration of occupations, industries, scientific fields and technologies. Using historical patent data, we show that the spatial concentration of cutting-edge technologies has increased since 1850, suggesting a reinforcing cycle between the increase in the complexity of activities and urbanization. These findings suggest that the growth of spatial inequality may be connected to the increasing complexity of the economy.
Balland et al. use data on scientific papers, patents, employment and GDP for 353 metropolitan areas in the United States to show that economic complexity drives the spatial concentration of productive activities in large cities.
Journal Article
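As a rough illustration of the 'variance explained' comparison described in the abstract above, the helper below computes the R^2 of a simple linear regression of an activity's urban concentration on its complexity; the paper's actual complexity proxies and concentration measures are not reproduced here.

```python
import numpy as np

def variance_explained(complexity, concentration):
    """R^2 of a simple linear regression of spatial concentration on complexity
    (a sketch; the inputs would be one value per occupation, industry, field or
    technology)."""
    x = np.asarray(complexity, dtype=float)
    y = np.asarray(concentration, dtype=float)
    slope, intercept = np.polyfit(x, y, 1)
    residuals = y - (slope * x + intercept)
    return 1.0 - residuals.var() / y.var()
```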