Catalogue Search | MBRL
Search Results
Explore the vast range of titles available.
2,576 result(s) for "Probability interpretations"
A consistent test of independence based on a sign covariance related to Kendall's tau
2014
The most popular ways to test for independence of two ordinal random variables are by means of Kendall's tau and Spearman's rho. However, such tests are not consistent, only having power for alternatives with "monotone" association. In this paper, we introduce a natural extension of Kendall's tau, called τ*, which is non-negative and zero if and only if independence holds, thus leading to a consistent independence test. Furthermore, normalization gives a rank correlation which can be used as a measure of dependence, taking values between zero and one. A comparison with alternative measures of dependence for ordinal random variables is given, and it is shown that, in a well-defined sense, τ* is the simplest, similarly to Kendall's tau being the simplest of ordinal measures of monotone association. Simulation studies show our test compares well with the alternatives in terms of average p-values.
Journal Article
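As a concrete illustration of the statistic described above, the sketch below brute-forces an unnormalized empirical τ* using a sign-kernel formulation commonly attributed to Bergsma and Dassios; the kernel and the O(n⁴) averaging are assumptions for illustration, not code from the paper.

```python
from itertools import combinations, permutations

def sign(v):
    return (v > 0) - (v < 0)

def sign_kernel(z1, z2, z3, z4):
    # One common formulation of the Bergsma-Dassios sign kernel (assumed here).
    return sign(abs(z1 - z2) + abs(z3 - z4) - abs(z1 - z3) - abs(z2 - z4))

def tau_star(x, y):
    """Brute-force empirical version of tau*: average the product of the
    x- and y-kernels over all ordered quadruples of distinct indices.
    O(n^4) work, so only suitable for small illustrative samples."""
    n = len(x)
    total = count = 0
    for quad in combinations(range(n), 4):
        for p in permutations(quad):
            total += sign_kernel(*(x[i] for i in p)) * sign_kernel(*(y[i] for i in p))
            count += 1
    return total / count
```

For y = x the kernel products are squares, so the statistic is strictly positive, while for independent data it hovers near zero, matching the consistency property claimed in the abstract.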
Towards a human-like observer: applying deep learning in an extended Wigner’s friend experiment
by
Zhang, Xiao
,
Zeng, Jinjun
in
Absoluteness of observed events
,
Artificial intelligence
,
Bell nonlocality
2025
There has been a longstanding demand for artificial intelligence (AI) with human-level cognitive sophistication to address loopholes in Bell-type experiments. In this study, we introduce a novel experimental framework that incorporates advanced deep learning techniques, utilizing neural network-based AI within an extended Wigner’s Friend experiment as a crucial step toward developing a human-like observer. We demonstrate the framework through simulations and introduce three new analytical metrics (morphing polygons, averaged Shannon entropy, and probability density maps) to evaluate the results. These results can be used to determine whether our AI qualifies as a bona fide observer and whether superposition applies to macroscopic systems, including observers.
Journal Article
Decoherence and its role in the modern measurement problem
2012
Decoherence is widely felt to have something to do with the quantum measurement problem, but getting clear on just what is made difficult by the fact that the 'measurement problem', as traditionally presented in foundational and philosophical discussions, has become somewhat disconnected from the conceptual problems posed by real physics. This, in turn, is because quantum mechanics as discussed in textbooks and in foundational discussions has become somewhat removed from scientific practice, especially where the analysis of measurement is concerned. This paper has two goals: firstly (§§1-2), to present an account of how quantum measurements are actually dealt with in modern physics (hint: it does not involve a collapse of the wave function) and to state the measurement problem from the perspective of that account; and secondly (§§3-4), to clarify what role decoherence plays in modern measurement theory and what effect it has on the various strategies that have been proposed to solve the measurement problem.
Journal Article
Evolution via Projection
by
Joshi, Mahendra
in
Classical and Quantum Gravitation
,
Classical Mechanics
,
Conditional probability
2023
The conditional probability interpretation of quantum gravity has been criticized for violating the constraints of the theory and for not giving the correct expression for the propagator. We show that, following Page’s proposal of constructing an appropriate projector for the stationary state of a closed system, we can arrive at the correct expression for the propagator by using the conditional probability rule. It is also shown that unitary evolution of subsystem states at the local level may be a consequence of non-unitary projection of appropriate states at the global level.
Journal Article
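The conditional probability rule via projectors that the abstract invokes can be sketched in a toy two-qubit setting. This is a generic Born-rule projection calculation in the Page–Wootters spirit, not the paper's construction; the state and projectors below are assumptions for illustration.

```python
import numpy as np

ket0 = np.array([1.0, 0.0])
ket1 = np.array([0.0, 1.0])
P0 = np.outer(ket0, ket0)   # projector |0><0|
P1 = np.outer(ket1, ket1)   # projector |1><1|
I2 = np.eye(2)

# Entangled clock-system state (|0>_C |0>_S + |1>_C |1>_S) / sqrt(2)
psi = (np.kron(ket0, ket0) + np.kron(ket1, ket1)) / np.sqrt(2)
rho = np.outer(psi, psi)

def conditional_prob(P_sys, P_clock, rho):
    """P(system outcome | clock outcome) by projecting the global state."""
    joint = np.trace(np.kron(P_clock, P_sys) @ rho).real
    marginal = np.trace(np.kron(P_clock, I2) @ rho).real
    return joint / marginal
```

Conditioning on the clock qubit reading 0 makes the system outcome 0 certain here, reflecting the correlations encoded in the global stationary state.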
Discrete-time probabilistic approximation of path-dependent stochastic control problems
2014
We give a probabilistic interpretation of the Monte Carlo scheme proposed by Fahim, Touzi and Warin [Ann. Appl. Probab. 21 (2011) 1322-1364] for fully nonlinear parabolic PDEs, and hence generalize it to the path-dependent (or non-Markovian) case for a general stochastic control problem. A general convergence result is obtained by a weak convergence method in the spirit of Kushner and Dupuis [Numerical Methods for Stochastic Control Problems in Continuous Time (1992) Springer]. We also obtain a rate of convergence using the invariance principle technique as in Dolinsky [Electron. J. Probab. 17 (2012) 1-5], which is better than that obtained by the viscosity solution method. Finally, by approximating the conditional expectations arising in the numerical scheme with a simulation-regression method, we obtain an implementable scheme.
Journal Article
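The simulation-regression step mentioned at the end of the abstract, estimating conditional expectations by regressing simulated payoffs on basis functions of the current state, can be sketched as follows; the toy dynamics, the polynomial basis, and all constants are assumptions for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 20000
x_t = rng.normal(size=n)                  # simulated state at time t
x_next = x_t + 0.1 * rng.normal(size=n)   # one Euler step of a toy diffusion
payoff = x_next ** 2                      # quantity whose conditional mean is needed

# Regression step: project the payoff onto a polynomial basis of x_t,
# giving a pointwise estimate of E[payoff | x_t] (here truly x_t**2 + 0.01).
coeffs = np.polyfit(x_t, payoff, deg=2)
cond_mean = np.polyval(coeffs, x_t)
```

The fitted polynomial replaces the intractable conditional expectation at every simulated state, which is what makes the backward scheme implementable.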
Temperature-based death time estimation in near equilibrium: Asymptotic confidence interval estimation
by
Mall, Gita
,
Muggenthaler, Holger
,
Hubig, Michael
in
Ambient temperature
,
Bias
,
Body temperature
2018
• In homicide cases, temperature-based death time estimation uses the method of Marshall & Hoare and Henßge (MHH).
• The VBT method of Potente et al. gives confidence intervals (CI) near temperature equilibrium using MHH.
• We show that VBT has an upper applicability limit in time since death.
• Beyond the limit, CI probabilities can be dramatically underestimated by VBT.
• The reason VBT misses the limit: a conventional CI interpretation instead of a frequentist interpretation.
Temperature-based death time estimation (TDE) is severely limited in situations where body core temperature has almost decreased to ambient temperature. The TDE method of Marshall/Hoare and Henßge (MHH) defines a lower bound TK for body core temperature below which the time since death should be stated only as >10 h.
A recent study (Potente et al., 2017 [10]) established a new method, called variance-bias-tradeoff (VBT), complementing MHH in constructing a right-side half-infinite 97.5% confidence interval for such ‘near equilibrium’ situations. It seemingly proved the validity for all body core temperatures T
Journal Article
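The near-equilibrium difficulty the study addresses can be illustrated with a much simpler single-exponential (Newtonian) cooling model than the double-exponential MHH model; all constants below are assumed for illustration only.

```python
import math

T_ambient, T_0, k = 20.0, 37.2, 0.12   # assumed ambient temp (C), initial temp (C), rate (1/h)

def time_since_death(T_body):
    """Invert Newtonian cooling T(t) = T_a + (T_0 - T_a) * exp(-k t)."""
    q = (T_body - T_ambient) / (T_0 - T_ambient)
    return -math.log(q) / k

def sensitivity(T_body):
    """|dt/dT|: factor by which a thermometer error inflates the time estimate.
    It diverges as T_body approaches T_ambient, the 'near equilibrium' problem."""
    return 1.0 / (k * (T_body - T_ambient))
```

With these assumed constants, a 0.1 °C measurement error at a body temperature of 30 °C shifts the estimate by a few minutes, while at 20.5 °C, only half a degree above ambient, the same error shifts it by well over an hour.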
Conceptualising morally permissible risk imposition without quantified individual risks
2022
We frequently engage in activities that impose a risk of serious harm on innocent others in order to realise trivial benefits for ourselves or third parties. Many moral theories tie the evidence-relative permissibility of engaging in such activities to the size of the risk that an individual agent imposes. I argue that we should move away from such a reliance on quantified individual risks when conceptualising morally permissible risk imposition. Under most circumstances of interest, a conscientious reasoner will identify a gap between the factors they deem potentially relevant to the riskiness of an agent’s behaviour, and the factors they are reasonably able to quantify. This then leads a conscientious reasoner to conclude that they cannot, in good faith, come up with a quantitative risk estimate that is genuinely tailored to the agent’s particular situation. Based on this, I argue that principles of morally permissible risk imposition fail to provide us with practical guidance if they ask us to take into account our agent-specific risks in a quantified manner. I also argue that principles of permissible risk imposition which appeal to quantified individual risks implausibly imply that it is frequently indeterminate whether engaging in some risky activity is morally permissible. For both of these reasons, I contend that principles of morally permissible risk imposition should make no reference to quantified individual risks. They should instead acknowledge that any quantitative estimates that an agent might usefully be able to consider will likely not be tailored to the agent’s idiosyncratic situation.
Journal Article
Subjectivity of pre-test probability value: controversies over the use of Bayes’ Theorem in medical diagnosis
2023
This article discusses the use of Bayes’ Theorem in medical diagnosis with a view to examining the epistemological problems of interpreting the concept of pre-test probability value. It is generally maintained that pre-test probability values are determined subjectively. Accordingly, this paper investigates three main philosophical interpretations of probability (the “classic” one, based on the principle of non-sufficient reason, the frequentist one, and the personalistic one). This study argues that using Bayes’ Theorem in medical diagnosis does not require accepting the radical personalistic interpretation. It will be shown that what distinguishes radical and moderate personalist interpretations is the criterion of conditional inter-subjectivity which applies only to the moderate account of personalist interpretation.
Journal Article
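Mechanically, the pre-test/post-test update the article examines is a direct application of Bayes' Theorem; a minimal sketch for a binary diagnostic test follows, where the prevalence, sensitivity, and specificity values in the example are illustrative.

```python
def post_test_probability(pre, sensitivity, specificity, positive=True):
    """Update a pre-test probability given a binary test result via Bayes' Theorem."""
    if positive:
        num = sensitivity * pre                    # true positives
        den = num + (1 - specificity) * (1 - pre)  # ... plus false positives
    else:
        num = (1 - sensitivity) * pre              # false negatives
        den = num + specificity * (1 - pre)        # ... plus true negatives
    return num / den

# With a 1% pre-test probability, a positive result from a 90%-sensitive,
# 95%-specific test raises the probability only to about 15%.
p = post_test_probability(0.01, 0.90, 0.95)
```

The philosophical controversy the article targets sits entirely in the first argument: where the pre-test value 0.01 comes from and what it means.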
Optimal estimator of hypothesis probability for data mining problems with small samples
by
Piegat, Andrzej
,
Landowski, Marek
in
completeness interpretation of probability
,
frequency interpretation of probability
,
probability
2012
The paper presents a new (to the best of the authors’ knowledge) estimator of probability called the “Eph√2 completeness estimator”, along with a theoretical derivation of its optimality. The estimator is especially suitable for a small number of sample items, which is the feature of many real problems characterized by data insufficiency. The control parameter of the estimator is not assumed in an a priori, subjective way, but was determined on the basis of an optimization criterion (the least absolute errors). The estimator was compared with the universally used frequency estimator of probability and with Cestnik’s m-estimator with respect to accuracy. The comparison was realized both theoretically and experimentally. The results show the superiority of the Eph√2 completeness estimator over the frequency estimator for the probability interval p ∈ (0.1, 0.9); the frequency estimator is better for p ∈ [0, 0.1] and p ∈ [0.9, 1].
Journal Article
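The abstract does not give the Eph√2 estimator's formula, so it is not reproduced here; the two baselines it is compared against can be sketched as follows (the default m and p0 values are illustrative).

```python
def frequency_estimate(k, n):
    """Relative-frequency estimator: k successes out of n trials."""
    return k / n

def m_estimate(k, n, m=2.0, p0=0.5):
    """Cestnik's m-estimator: shrinks the frequency toward a prior p0
    with strength m; m=2, p0=0.5 reduces to Laplace smoothing."""
    return (k + m * p0) / (n + m)
```

With two successes in two trials the frequency estimator returns the extreme value 1.0, while the m-estimator tempers it to 0.75; this small-sample regime is exactly where the paper claims its estimator helps.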
Renewal Redundant Systems Under the Marshall–Olkin Failure Model. A Probability Analysis
by
Rykov, Vladimir
,
Dimitrov, Boyan
,
Milovanova, Tatiana
in
Failure
,
Failure analysis
,
LST and PGF probability interpretation
2020
In this paper a two-component redundant renewable system operating under the Marshall–Olkin failure model is considered. The purpose of the study is to find analytical expressions for the time-dependent and the steady-state characteristics of the system. The system cycle process characteristics are analyzed by means of the probability interpretation of Laplace–Stieltjes transforms (LSTs) and of probability generating functions (PGFs). In this way, long analytic derivations are avoided. As a result of the investigation, the main reliability characteristics of the system (the reliability function and the steady-state probabilities) have been found in analytical form. Our approach can be used in studies of various applications of systems with dependent failures between their elements.
Journal Article
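The Marshall–Olkin failure model underlying the paper can be simulated directly; the sketch below (rates and seed are assumed) shows its defining feature, a common shock that fails both components at the same instant.

```python
import random

def marshall_olkin_sample(lam1, lam2, lam12, rng):
    """One draw of component lifetimes (X, Y) under the Marshall-Olkin
    failure model: independent exponential shocks hit component 1 (rate
    lam1), component 2 (rate lam2), or both at once (rate lam12)."""
    e1 = rng.expovariate(lam1)
    e2 = rng.expovariate(lam2)
    e12 = rng.expovariate(lam12)
    return min(e1, e12), min(e2, e12)

rng = random.Random(42)
pairs = [marshall_olkin_sample(1.0, 1.0, 1.0, rng) for _ in range(5000)]
# The common shock makes simultaneous failures possible: with equal rates,
# P(X == Y) = 1/3, a dependence no product of independent exponentials has.
tie_fraction = sum(x == y for x, y in pairs) / len(pairs)
```

This simultaneous-failure mass is what distinguishes the model from independent component lifetimes and motivates the LST/PGF analysis in the paper.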