9,325 result(s) for "Heavens, Alan"
Generalisations of Fisher Matrices
Fisher matrices play an important role in experimental design and in data analysis. Their primary role is to make predictions for the inference of model parameters: both their errors and covariances. In this short review, I outline several extensions to the simple Fisher matrix formalism, covering a number of recent developments in the field. These are: (a) situations where the data (in the form of (x, y) pairs) have errors in both x and y; (b) modifications to parameter inference in the presence of systematic errors, or through fixing the values of some model parameters; (c) Derivative Approximation for LIkelihoods (DALI), higher-order expansions of the likelihood surface that go beyond the Gaussian shape approximation; and (d) extensions of the Fisher-like formalism to treat model selection problems with Bayesian evidence.
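The basic formalism that the review extends can be sketched as follows. This is an illustrative example under my own assumptions, not code from the paper: for a model mu(x; theta) with independent Gaussian errors sigma_k, the Fisher matrix is F_ij = sum_k (dmu_k/dtheta_i)(dmu_k/dtheta_j)/sigma_k^2, and the forecast marginalised 1-sigma error on theta_i is sqrt((F^-1)_ii).

```python
import numpy as np

def fisher_matrix(dmu_dtheta, sigma):
    """Fisher matrix for independent Gaussian errors.

    dmu_dtheta: (n_params, n_data) array of model derivatives.
    sigma:      (n_data,) array of Gaussian error bars.
    """
    w = 1.0 / sigma**2
    # F_ij = sum_k dmu_k/dtheta_i * dmu_k/dtheta_j / sigma_k^2
    return np.einsum('ik,jk,k->ij', dmu_dtheta, dmu_dtheta, w)

# Example: straight line mu = a + b*x fitted to 10 points with sigma = 1
x = np.linspace(0.0, 1.0, 10)
sigma = np.ones_like(x)
derivs = np.stack([np.ones_like(x), x])       # dmu/da, dmu/db
F = fisher_matrix(derivs, sigma)
errors = np.sqrt(np.diag(np.linalg.inv(F)))   # marginalised 1-sigma forecasts
```

Inverting the full matrix before taking the diagonal gives marginalised errors; taking 1/sqrt(F_ii) instead would give the tighter conditional errors, with the other parameters held fixed.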
The star-formation history of the Universe from the stellar populations of nearby galaxies
The determination of the star-formation history of the Universe is a key goal of modern cosmology, as it is crucial to our understanding of how galactic structures form and evolve. Observations (refs 1–12) of young stars in distant galaxies at different times in the past have indicated that the stellar birthrate peaked some eight billion years ago before declining by a factor of around ten to its present value. Here we report an analysis of the ‘fossil record’ of the current stellar populations of 96,545 nearby galaxies, from which we obtained a complete star-formation history. Our results broadly support those derived from high-redshift galaxies. We find, however, that the peak of star formation was more recent—around five billion years ago. We also show that the bigger the stellar mass of the galaxy, the earlier the stars were formed, which indicates that high- and low-mass galaxies have very different histories.
Is time travel possible?
Time travel is a staple of popular fiction and culture and, as with many ideas that make their way into popular culture, it has some basis in real physics. But is it a fanciful extrapolation or a feasible possibility? This paper explores some notions of time and time travel from a physicist’s perspective.
Handbook for the GREAT08 Challenge: An Image Analysis Competition for Cosmological Lensing
The GRavitational lEnsing Accuracy Testing 2008 (GREAT08) Challenge focuses on a problem that is of crucial importance for future observations in cosmology. The shapes of distant galaxies can be used to determine the properties of dark energy and the nature of gravity, because light from those galaxies is bent by gravity from the intervening dark matter. The observed galaxy images appear distorted, although only slightly, and their shapes must be precisely disentangled from the effects of pixelisation, convolution and noise. The worldwide gravitational lensing community has made significant progress in techniques to measure these distortions via the Shear TEsting Program (STEP). Via STEP, we have run challenges within our own community, and come to recognise that this particular image analysis problem is ideally matched to experts in statistical inference, inverse problems and computational learning. Thus, in order to continue the progress seen in recent years, we are seeking an infusion of new ideas from these communities. This document details the GREAT08 Challenge for potential participants. Please visit www.great08challenge.info for the latest information.
Cosmology and fundamental physics with the Euclid satellite
Euclid is a European Space Agency medium-class mission selected for launch in 2020 within the Cosmic Vision 2015–2025 program. The main goal of Euclid is to understand the origin of the accelerated expansion of the universe. Euclid will explore the expansion history of the universe and the evolution of cosmic structures by measuring shapes and redshifts of galaxies, as well as the distribution of clusters of galaxies, over a large fraction of the sky. Although the main driver for Euclid is the nature of dark energy, Euclid science covers a vast range of topics, from cosmology to galaxy evolution to planetary research. In this review we focus on cosmology and fundamental physics, with a strong emphasis on science beyond the current standard models. We discuss five broad topics: dark energy and modified gravity, dark matter, initial conditions, basic assumptions and questions of methodology in the data analysis. This review has been planned and carried out within Euclid’s Theory Working Group and is meant to provide a guide to the scientific themes that will underlie the activity of the group during the preparation of the Euclid mission.
GRAVITATIONAL LENSING ACCURACY TESTING 2010 (GREAT10) CHALLENGE HANDBOOK
GRavitational lEnsing Accuracy Testing 2010 (GREAT10) is a public image analysis challenge aimed at the development of algorithms to analyze astronomical images. Specifically, the challenge is to measure varying image distortions in the presence of a variable convolution kernel, pixelization and noise. This is the second in a series of challenges set to the astronomy, computer science and statistics communities, providing a structured environment in which methods can be improved and tested in preparation for planned astronomical surveys. GREAT10 extends upon previous work by introducing variable fields into the challenge. The "Galaxy Challenge" involves the precise measurement of galaxy shape distortions, quantified locally by two parameters called shear, in the presence of a known convolution kernel. Crucially, the convolution kernel and the simulated gravitational lensing shape distortion both now vary as a function of position within the images, as is the case for real data. In addition, we introduce the "Star Challenge" that concerns the reconstruction of a variable convolution kernel, similar to that in a typical astronomical observation. This document details the GREAT10 Challenge for potential participants. Continually updated information is also available from www.greatchallenges.info.
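The "two parameters called shear" act as a linear coordinate distortion. As a minimal sketch, under my own assumptions and not taken from the GREAT10 handbook, a reduced shear (g1, g2) maps image-plane positions to source-plane positions, so a circular source appears as an ellipse:

```python
import numpy as np

def shear_matrix(g1, g2):
    """Linear lensing distortion for reduced shear (g1, g2):
    [x_src]   [1 - g1,  -g2  ] [x_img]
    [y_src] = [ -g2  , 1 + g1] [y_img]
    """
    return np.array([[1.0 - g1, -g2],
                     [-g2, 1.0 + g1]])

# Map a unit circle of points through a pure g1 = 0.05 shear
theta = np.linspace(0.0, 2.0 * np.pi, 100, endpoint=False)
circle = np.stack([np.cos(theta), np.sin(theta)])
A = shear_matrix(0.05, 0.0)
sheared = A @ circle          # points on the corresponding ellipse

# For pure g1 the ellipse semi-axes are 1 - g1 and 1 + g1,
# so the axis ratio is (1 - g1) / (1 + g1)
axis_ratio = (1.0 - 0.05) / (1.0 + 0.05)
```

The challenge is to recover (g1, g2) at the per-cent level when this small distortion is buried under convolution, pixelization and noise.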
Geometry of the Universe
A neat way of measuring the geometry of the Universe offers a new test of the standard cosmological model. It probes, among other things, the elusive dark energy thought to be driving the Universe's expansion. See Letter p.539.

Dark energy passes the test. In a Letter to Nature in October 1979, Charles Alcock and Bohdan Paczynski ( http://go.nature.com/EHsEDM ) proposed a strategy to solve one of cosmology's most recalcitrant problems: measuring the curvature of the Universe using a model-independent, purely geometrical, approach. Until now, this idea, which would put a value on the dark energy component thought to act to oppose gravity, has never been substantiated. With distant galaxy pairs as test objects, Christian Marinoni and Adeline Buzzi report the successful implementation of the Alcock–Paczynski test. Their analysis of the symmetry of distant galaxy pairs from archival data allows them to determine that the Universe is flat, and, by alternately fixing its spatial geometry and the dark energy equation-of-state parameter, they establish new measures for both the abundance and the equation of state of dark energy.
On the accuracy and precision of correlation functions and field-level inference in cosmology
We present a comparative study of the accuracy and precision of correlation function methods and full-field inference in cosmological data analysis. To do so, we examine a Bayesian hierarchical model that predicts log-normal fields and their two-point correlation function. Although a simplified analytic model, the log-normal model produces fields that share many of the essential characteristics of the present-day non-Gaussian cosmological density fields. We use three different statistical techniques: (i) a standard likelihood-based analysis of the two-point correlation function; (ii) a likelihood-free (simulation-based) analysis of the two-point correlation function; (iii) a field-level analysis, made possible by the more sophisticated data assimilation technique. We find that (a) standard assumptions made to write down a likelihood for correlation functions can cause significant biases, a problem that is alleviated with simulation-based inference; and (b) analysing the entire field offers considerable advantages over correlation functions, through higher accuracy, higher precision, or both. The gains depend on the degree of non-Gaussianity, but in all cases, including for weak non-Gaussianity, the advantage of analysing the full field is substantial.
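The log-normal construction underlying the comparison can be sketched briefly. This is my own toy setup, not the paper's pipeline: a Gaussian field g with variance s2 is transformed as delta = exp(g - s2/2) - 1, which guarantees a zero mean and delta > -1, mimicking a cosmological density contrast, and the two-point correlation function can then be estimated from the field.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 4096

# Correlated 1-D Gaussian field: white noise smoothed with a Gaussian
# kernel (sigma = 3 pixels), normalised to preserve unit variance
g = rng.standard_normal(n)
kernel = np.exp(-0.5 * (np.arange(-10, 11) / 3.0) ** 2)
kernel /= np.sqrt(np.sum(kernel**2))
g = np.convolve(g, kernel, mode='same')

# Log-normal transform: zero mean by construction, bounded below by -1
s2 = np.var(g)
delta = np.exp(g - s2 / 2.0) - 1.0

def xi(field, lag):
    """Simple two-point correlation estimate at a given lag."""
    return np.mean(field[:-lag] * field[lag:]) if lag else np.mean(field**2)
```

Because the transform is a known, invertible function of a Gaussian field, both the correlation-function likelihood and the field-level posterior are tractable, which is what makes the controlled accuracy/precision comparison possible.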