26 result(s) for "Tools for Experiment and Theory - Experimental Physics"
A facility for pion-induced nuclear reaction studies with HADES
The combination of a production target for secondary beams, an optimized ion-optical beam line setting, in-beam detectors for minimum ionizing particles with high rate capability, and an efficient large-acceptance spectrometer around the reaction target constitutes an experimental opportunity to study in detail hadronic interactions utilizing pion beams impinging on nucleons and nuclei. For the 0.4–2.0 GeV/c pion momentum regime such a facility is located at the heavy-ion synchrotron accelerator SIS18 in Darmstadt (Germany). The layout of the apparatus, the performance of its components, and encouraging results from a first commissioning run are presented.
Reconstruction of interactions in the ProtoDUNE-SP detector with Pandora
The Pandora Software Development Kit and algorithm libraries provide pattern-recognition logic essential to the reconstruction of particle interactions in liquid argon time projection chamber detectors. Pandora is the primary event reconstruction software used at ProtoDUNE-SP, a prototype for the Deep Underground Neutrino Experiment far detector. ProtoDUNE-SP, located at CERN, is exposed to a charged-particle test beam. This paper gives an overview of the Pandora reconstruction algorithms and how they have been tailored for use at ProtoDUNE-SP. In complex events with numerous cosmic-ray and beam background particles, the simulated reconstruction and identification efficiency for triggered test-beam particles is above 80% for the majority of particle type and beam momentum combinations. Specifically, simulated 1 GeV/c charged pions and protons are correctly reconstructed and identified with efficiencies of 86.1 ± 0.6% and 84.1 ± 0.6%, respectively. The efficiencies measured for test-beam data are shown to be within 5% of those predicted by the simulation.
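Efficiencies of this kind are typically quoted with a simple binomial uncertainty. A minimal sketch of that calculation; the pass/total counts below are hypothetical, chosen only to give numbers of the same order as those quoted above:

```python
import math

def efficiency_with_error(n_pass: int, n_total: int) -> tuple[float, float]:
    """Selection efficiency and its binomial uncertainty sqrt(e*(1-e)/N)."""
    eff = n_pass / n_total
    err = math.sqrt(eff * (1.0 - eff) / n_total)
    return eff, err

# Hypothetical counts (not from the paper), ~86% efficiency on ~3300 events:
eff, err = efficiency_with_error(2862, 3324)
print(f"{100 * eff:.1f} ± {100 * err:.1f} %")
```

Note the uncertainty shrinks as 1/sqrt(N), so a sub-percent error such as ±0.6% implies a simulated sample of a few thousand triggered particles per category.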
The ATLAS Simulation Infrastructure
The simulation software for the ATLAS Experiment at the Large Hadron Collider is being used for large-scale production of events on the LHC Computing Grid. This simulation requires many components, from the generators that simulate particle collisions, through packages simulating the response of the various detectors and triggers. All of these components come together under the ATLAS simulation infrastructure. In this paper, that infrastructure is discussed, including that supporting the detector description, interfacing the event generation, and combining the GEANT4 simulation of the response of the individual detectors. Also described are the tools allowing the software validation, performance testing, and the validation of the simulated output against known physics processes.
HistFitter software framework for statistical data analysis
We present a software framework for statistical data analysis, called HistFitter, that has been used extensively by the ATLAS Collaboration to analyze big datasets originating from proton–proton collisions at the Large Hadron Collider at CERN. Since 2012 HistFitter has been the standard statistical tool in searches for supersymmetric particles performed by ATLAS. HistFitter is a programmable and flexible framework to build, book-keep, fit, interpret and present results of data models of nearly arbitrary complexity. Starting from an object-oriented configuration, defined by users, the framework builds probability density functions that are automatically fit to data and interpreted with statistical tests. Internally HistFitter uses the statistics packages RooStats and HistFactory. A key innovation of HistFitter is its design, which is rooted in analysis strategies of particle physics. The concepts of control, signal and validation regions are woven into its fabric. These are progressively treated with statistically rigorous built-in methods. Being capable of working with multiple models at once that describe the data, HistFitter introduces an additional level of abstraction that allows for easy bookkeeping, manipulation and testing of large collections of signal hypotheses. Finally, HistFitter provides a collection of tools to present results with publication quality style through a simple command-line interface.
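The control/signal-region strategy described in this abstract can be illustrated with a toy simultaneous fit. This sketch deliberately does not use HistFitter's actual API (which builds on RooStats/HistFactory); it is a conceptual illustration with scipy, and the event counts, transfer factor, and nominal signal yield are all made-up values:

```python
# Toy simultaneous fit of a background-dominated control region (CR) and a
# signal region (SR), in the spirit of the strategy described above.
# NOT the HistFitter API; all numbers are invented for illustration.
import numpy as np
from scipy.optimize import minimize

n_cr, n_sr = 1000, 58           # observed counts (hypothetical)
transfer_factor = 0.05          # assumed CR-to-SR background extrapolation
s_expected = 10.0               # assumed nominal signal yield in the SR

def nll(params):
    """Poisson negative log-likelihood over both regions (constants dropped)."""
    mu_s, mu_b = params
    exp_cr = mu_b                                  # CR constrains background
    exp_sr = mu_b * transfer_factor + mu_s * s_expected
    ll = (n_cr * np.log(exp_cr) - exp_cr) + (n_sr * np.log(exp_sr) - exp_sr)
    return -ll

fit = minimize(nll, x0=[1.0, float(n_cr)], bounds=[(0, None), (1e-6, None)])
mu_s_hat, mu_b_hat = fit.x      # fitted signal strength and background yield
```

The control region pins the background normalisation, and the signal strength then absorbs any signal-region excess over the extrapolated background; HistFitter automates exactly this kind of bookkeeping across many regions and signal hypotheses.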
Performance of the LHCb RICH detector at the LHC
The LHCb experiment has been taking data at the Large Hadron Collider (LHC) at CERN since the end of 2009. One of its key detector components is the Ring-Imaging Cherenkov (RICH) system. This provides charged particle identification over a wide momentum range, from 2–100 GeV/c. The operation and control, software, and online monitoring of the RICH system are described. The particle identification performance is presented, as measured using data from the LHC. Excellent separation of hadronic particle types (π, K, p) is achieved.
Analysis of cryogenic calorimeters with light and heat read-out for double beta decay searches
The suppression of spurious events in the region of interest for neutrinoless double beta decay will play a major role in next-generation experiments. The background of detectors based on the technology of cryogenic calorimeters is expected to be dominated by α particles, which could be disentangled from double beta decay signals by exploiting the difference in the emission of scintillation light. CUPID-0, an array of enriched Zn⁸²Se scintillating calorimeters, is the first large-mass demonstrator of this technology. The detector started data-taking in 2017 at the Laboratori Nazionali del Gran Sasso with the aim of proving that dual read-out of light and heat allows for an efficient suppression of the α background. In this paper we describe the software tools we developed for the analysis of scintillating calorimeters and we demonstrate that this technology makes it possible to reach an unprecedentedly low background for cryogenic calorimeters.
Reconstruction of physics objects at the Circular Electron Positron Collider with Arbor
After the Higgs discovery, precise measurements of the Higgs properties and the electroweak observables have become vital for experimental particle physics. A powerful Higgs/Z factory, the Circular Electron Positron Collider (CEPC), has been proposed. A Particle Flow oriented detector design has been proposed for the CEPC, and a Particle Flow algorithm, Arbor, has been optimized accordingly. We summarize the physics-object reconstruction performance of the Particle Flow oriented detector design with the Arbor algorithm and conclude that this combination fulfills the physics requirements of the CEPC.
CMS tracking performance results from early LHC operation
The first LHC pp collisions at centre-of-mass energies of 0.9 and 2.36 TeV were recorded by the CMS detector in December 2009. The trajectories of charged particles produced in the collisions were reconstructed using the all-silicon Tracker, and their momenta were measured in the 3.8 T axial magnetic field. Results from the Tracker commissioning are presented, including studies of timing, efficiency, signal-to-noise, and resolution. Reconstructed tracks are used to benchmark the performance in terms of track and vertex resolutions, reconstruction of decays, estimation of ionization energy loss, as well as identification of photon conversions, nuclear interactions, and heavy-flavour decays.
A high-resolution pixel silicon Vertex Detector for open charm measurements with the NA61/SHINE spectrometer at the CERN SPS
The study of open charm meson production provides an efficient tool for the investigation of the properties of hot and dense matter formed in nucleus–nucleus collisions. The interpretation of the existing di-muon data from the CERN SPS suffers from a lack of knowledge on the mechanism and properties of the open charm particle production. Due to this, the heavy-ion programme of the NA61/SHINE experiment at the CERN SPS has been extended by precise measurements of charm hadrons with short lifetimes. A new Vertex Detector for measurements of the rare processes of open charm production in nucleus–nucleus collisions was designed to meet the challenges of track registration and high resolution in primary and secondary vertex reconstruction. A small-acceptance version of the vertex detector was installed in 2016 and tested with Pb+Pb collisions at 150A GeV/c. It was also operating during the physics data taking on Xe+La and Pb+Pb collisions at 150A GeV/c conducted in 2017 and 2018. This paper presents the detector design and construction, data calibration, event reconstruction, and analysis procedure.
Readiness of the ATLAS Tile Calorimeter for LHC collisions
The Tile hadronic calorimeter of the ATLAS detector has undergone extensive testing in the experimental hall since its installation in late 2005. The readout, control and calibration systems have been fully operational since 2007, and the detector has successfully collected data from the LHC single beams in 2008 and first collisions in 2009. This paper gives an overview of the Tile Calorimeter performance as measured using random triggers, calibration data, data from cosmic-ray muons and single-beam data. The detector operation status, noise characteristics and performance of the calibration systems are presented, as well as the validation of the timing and energy calibration carried out with minimum-ionising cosmic-ray muon data. The calibration systems' precision is well below the design value of 1%. The determination of the global energy scale was performed with an uncertainty of 4%.