375 result(s) for "Saunders, Simon"
The Gibbs Paradox
The Gibbs Paradox is essentially a set of open questions as to how sameness of gases or fluids (or masses, more generally) is to be treated in thermodynamics and statistical mechanics. They have a variety of answers, some restricted to quantum theory (there is no classical solution), some to classical theory (the quantum case is different). The solution offered here applies to both in equal measure, and is based on the concept of particle indistinguishability (in the classical case, Gibbs’ notion of ‘generic phase’). Correctly understood, it is the elimination of sequence position as a labelling device, where sequences enter at the level of the tensor (or Cartesian) product of one-particle state spaces. In both cases it amounts to passing to the quotient space under permutations. ‘Distinguishability’, in the sense in which it is usually used in classical statistical mechanics, is a mathematically convenient, but physically muddled, fiction.
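As a toy illustration of the counting move the abstract describes (my sketch, not the paper's): with k one-particle states and n particles, sequences over the Cartesian product use sequence position as a particle label, while passing to the quotient under permutations leaves only the multiset of occupied states.

```python
from math import comb

def labelled_states(n_particles: int, k_states: int) -> int:
    # Sequences over the tensor/Cartesian product of one-particle
    # state spaces: position in the sequence labels the particle,
    # so order matters and there are k**n configurations.
    return k_states ** n_particles

def quotiented_states(n_particles: int, k_states: int) -> int:
    # Quotient under permutations: sequence position is erased,
    # leaving multisets of occupied states (stars and bars).
    return comb(n_particles + k_states - 1, n_particles)

# Two particles over three one-particle states: 9 labelled
# sequences collapse to 6 permutation-equivalence classes.
print(labelled_states(2, 3), quotiented_states(2, 3))
```

The gap between the two counts is exactly the over-counting that the conventional N! correction papers over.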
Branch-counting in the Everett interpretation of quantum mechanics
A defence is offered of a version of the branch-counting rule in the Everett interpretation (otherwise known as many worlds interpretation) of quantum mechanics that both depends on the state and is continuous in the norm topology on Hilbert space. The well-known branch-counting rule, for realistic models of measurements, in which branches are defined by decoherence theory, fails this test. The new rule hinges on the use of decoherence theory in defining branching structure, and specifically decoherent histories theory. On this basis ratios of branch numbers are defined, free of any convention. They agree with the Born rule and deliver a notion of objective probability similar to naive frequentism, save that the frequencies of outcomes are not confined to a single world at different times, but spread over worlds at a single time. Nor is it ad hoc: it is recognizably akin to the combinatorial approach to thermodynamic probability, as introduced by Boltzmann in 1879. It is identical to the procedure followed by Planck, Bose, Einstein and Dirac in defining the equilibrium distribution of the Bose–Einstein gas. It also connects in a simple way with the decision-theory approach to quantum probability.
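The abstract likens branch counting to Boltzmann's 1879 combinatorial approach to thermodynamic probability. A minimal sketch of that combinatorial weight (an illustration under my own choice of numbers, not taken from the paper):

```python
from math import factorial

def multiplicity(occupations: list[int]) -> int:
    # Boltzmann's combinatorial weight W: the number of ways N
    # labelled particles can realise the given occupation numbers,
    # W = N! / (n1! * n2! * ... * nk!).
    n = sum(occupations)
    w = factorial(n)
    for k in occupations:
        w //= factorial(k)
    return w

# Four particles over two cells: the even split has the largest
# weight, so it is the most probable distribution in this sense.
assert multiplicity([2, 2]) == 6
assert multiplicity([3, 1]) == 4
assert multiplicity([4, 0]) == 1
```

On the Everettian reading sketched in the abstract, the analogous counting is over branches at a single time rather than outcomes in one world over time.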
Physical Probability and Locality in no-collapse quantum theory
Probability is distinguished into two kinds: physical and epistemic, also, but less accurately, called objective and subjective. Simple postulates are given for physical probability, the only novel one being a locality condition. Translated into no-collapse quantum mechanics, without hidden variables, the postulates imply that the elements in any equiamplitude expansion of the quantum state are equiprobable. Such expansions therefore provide ensembles of microstates that can be used to define probabilities in the manner of frequentism, in von Mises’ sense (where the probability of P is the frequency of occurrence of P in a suitable ensemble). The result is the Born rule. Since they satisfy our postulates, and in particular the locality condition (meaning no action-at-a-distance), these probabilities for no-collapse quantum mechanics are perfectly local, even though they violate Bell inequalities. The latter can be traced to a violation of outcome independence, used to derive the inequalities. But in no-collapse theory it is not a locality condition; it is a criterion for entanglement, not locality.
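To make the equiamplitude construction concrete (a sketch with hypothetical weights, not an example from the paper): rewriting rational Born weights over a common denominator d splits each outcome into branches of equal squared amplitude 1/d, and the Born weight reappears as the frequency of that outcome among the equiprobable ensemble elements.

```python
from fractions import Fraction
from math import lcm

def branch_counts(born_weights: list[Fraction]) -> list[int]:
    # Over a common denominator d, an outcome of Born weight p/d
    # expands into p elements of equal squared amplitude 1/d, so
    # element numbers stand in the same ratios as the Born weights.
    d = lcm(*(w.denominator for w in born_weights))
    return [int(w * d) for w in born_weights]

# Hypothetical two-outcome state with Born weights 2/3 and 1/3:
# the equiamplitude expansion has 3 equiprobable elements, 2 for
# the first outcome and 1 for the second. Frequency of each
# outcome in the ensemble recovers its Born weight.
weights = [Fraction(2, 3), Fraction(1, 3)]
counts = branch_counts(weights)
assert [Fraction(c, sum(counts)) for c in counts] == weights
```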
Derivation of the Born rule from operational assumptions
The Born rule is derived from operational assumptions, together with assumptions of quantum mechanics that concern only the deterministic development of the state. Unlike Gleason's theorem, the argument applies even if probabilities are defined for only a single resolution of the identity, so it applies to a variety of foundational approaches to quantum mechanics. It also provides a probability rule for state spaces that are not Hilbert spaces.
Rethinking Newton's Principia
It is widely accepted that the notion of an inertial frame is central to Newtonian mechanics and that the correct space-time structure underlying Newton's methods in Principia is neo-Newtonian or Galilean space-time. I argue to the contrary that inertial frames are not needed in Newton's theory of motion, and that the right space-time structure for Newton's Principia requires the notion of parallelism of spatial directions at different times and nothing more.
The Negative Energy Sea
The Dirac negative energy sea introduced the concept of antimatter, and explained it, not least in its relationship to negative-energy solutions to the wave equation. Post-war, it was largely displaced by what I shall call the 'standard formalism', dependent, among other things, on normal-ordering. A much better explanation is provided by the 'two complex structures' viewpoint, as first introduced by Irving Segal: the one ('natural') kind of complex numbers at the level of covariant, local fields; and the other ('particle') complex numbers at the level of the one-particle Hilbert space and Fock space. The former is local, the latter non-local: therein lies the fundamental difference between relativistic and non-relativistic quantum theory.
Femtocell Networks
[...] the accuracy of the resulting analysis on the SINR PDF is validated in the paper via Monte Carlo simulations. [...] in "Extremal signal quality in cellular networks: asymptotic properties and applications to mobility management in small cell networks", V. M. Nguyen et al. investigate the critical issues of extremal signal quality and mobility management in small-cell networks accounting for the characteristics of high density and randomness of small cells.