263,710 results for "data analysis method"
Selection and processing of calibration samples to measure the particle identification performance of the LHCb experiment in Run 2
Since 2015, with the restart of the LHC for its second run of data taking, the LHCb experiment has been empowered with a dedicated computing model to select and analyse calibration samples to measure the performance of the particle identification (PID) detectors and algorithms. The novel technique was developed within the framework of the innovative trigger model of the LHCb experiment, which relies on online event reconstruction for most of the datasets, reserving offline reconstruction for special physics cases. The strategy to select and process the calibration samples, which includes a dedicated data-processing scheme combining online and offline reconstruction, is discussed. The use of the calibration samples to measure the detector PID performance, and the efficiency of PID requirements across a large range of decay channels, is described. Applications of the calibration samples in data-quality monitoring and validation procedures are also detailed.
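The abstract does not spell out how a PID-cut efficiency is extracted from a calibration sample, but with background-subtracted (sWeighted) samples of the kind described here, the efficiency of a requirement is conventionally the ratio of signal-weight sums with and without the cut. A minimal sketch under that assumption, with hypothetical variable names:

```python
import numpy as np

def pid_efficiency(pid_response, sweights, cut):
    """Efficiency of the requirement pid_response > cut on a
    background-subtracted (sWeighted) calibration sample."""
    passed = sweights[pid_response > cut].sum()
    total = sweights.sum()
    eff = passed / total
    # Approximate binomial-style uncertainty for weighted events.
    err = np.sqrt(eff * (1.0 - eff) * np.sum(sweights**2)) / total
    return eff, err

# Toy example: 10,000 candidates with a hypothetical PID variable.
rng = np.random.default_rng(0)
pid_k = rng.normal(5.0, 3.0, 10_000)
weights = np.ones_like(pid_k)  # signal weights from an sFit would go here
print(pid_efficiency(pid_k, weights, cut=0.0))
```

In practice such efficiencies are typically evaluated in bins of kinematics and event multiplicity so they can be transported to the decay channel under study.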
Investigating the impact of smart manufacturing on firms' operational and financial performance
Purpose: Smart manufacturing (SM) lies at the core of Industry 4.0. Uniform adoption of SM across business partners is crucial to exploiting its value-creation potential. However, firms' willingness to invest in SM is limited by insufficient or inconclusive evidence on its performance-related benefits. To close this gap, this paper develops and tests a model linking SM adoption to firms' financial performance, with improvements along the four dimensions of operational performance (i.e. cost, quality, delivery and flexibility) mediating this relation.
Design/methodology/approach: This study follows an empirical research approach. Survey data from 234 automotive component suppliers are analyzed via covariance-based structural equation modeling to explore the link between SM adoption and operational performance. The survey data are then matched with secondary balance-sheet data from 81 firms to investigate the impact of SM on financial performance via partial least squares structural equation modeling.
Findings: Adoption of SM results in improvements in cost, quality and delivery performance, suggesting that SM is a means to overcome performance trade-offs. However, the operational improvements enabled by SM do not give rise to superior financial performance, implying that SM may help firms maintain their competitive position in the market but could be insufficient to generate higher margins.
Originality/value: The results have implications for SM research and for manufacturing executives engaged in the adoption of SM, as they provide a detailed analysis of the impact of SM on operational performance and clarify the effect of SM adoption on financial performance.
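Purely as an illustration of the mediation structure the abstract describes (SM adoption improves operational performance, which in turn may affect financial performance), here is a hedged sketch using the open-source semopy package; the variable names and model description are hypothetical, not the paper's measurement model:

```python
import pandas as pd
import semopy

# Hypothetical structural model: operational performance mediates the
# effect of SM adoption on financial performance.
desc = """
operational ~ sm_adoption
financial   ~ operational + sm_adoption
"""

df = pd.read_csv("survey_data.csv")  # hypothetical matched survey/balance-sheet data
model = semopy.Model(desc)
model.fit(df)
print(model.inspect())  # path estimates, standard errors, p-values
```

A near-zero path from operational to financial performance would mirror the paper's finding that operational gains do not translate into superior financial results.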
Characterization of Single-mode Fiber Coupling at the Large Binocular Telescope
Optimizing on-sky single-mode fiber (SMF) injection is an essential part of developing precise Doppler spectrometers and new astrophotonics technologies. We installed and tested a prototype SMF-injection system at the Large Binocular Telescope in 2016 April. The fiber injection unit was built as part of the derisking process for a new instrument named iLocater that will use adaptive optics (AO) to feed a high-resolution, near-infrared spectrograph. In this paper we report Y-band SMF coupling measurements for bright, M-type stars. We compare theoretical expectations for delivered Strehl ratio and SMF coupling to experimental results, and evaluate fundamental effects that limit injection efficiency. We find that the pupil geometry of the telescope itself limits fiber coupling to a maximum efficiency of ρ_tel = 0.78. Further analysis shows that the individual impacts of AO correction, tip-tilt residuals, and static (non-common-path) aberrations contribute coupling coefficients of ρ_Strehl ≈ 0.33, ρ_tip-tilt ≈ 0.84, and ρ_NCPA ≈ 0.8, respectively. Combined, these effects resulted in an average Y-band SMF efficiency of 0.18 for all observations. Finally, we investigate the impact of fiber coupling on radial-velocity precision as a function of stellar apparent magnitude.
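A quick consistency check, assuming the quoted coefficients combine multiplicatively (which the abstract implies but does not state explicitly):

```python
# Coupling coefficients quoted in the abstract.
rho_tel, rho_strehl, rho_tiptilt, rho_ncpa = 0.78, 0.33, 0.84, 0.8
rho_total = rho_tel * rho_strehl * rho_tiptilt * rho_ncpa
print(f"combined coupling: {rho_total:.2f}")  # ~0.17, close to the reported 0.18 average
```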
SiMon: Simulation Monitor for Computational Astrophysics
Scientific discovery via numerical simulations is important in modern astrophysics. This relatively new branch of astrophysics has become possible due to the development of reliable numerical algorithms and the high performance of modern computing technologies, which enable the analysis of large collections of observational data and the acquisition of new data via simulations at unprecedented accuracy and resolution. Ideally, simulations run until they reach some pre-determined termination condition, but often other factors cause extensive numerical approaches to break down at an earlier stage: processes tend to be interrupted by unexpected events in the software or the hardware, and the scientist must then handle the interrupt manually, which is time-consuming and error-prone. We present the Simulation Monitor (SiMon) to automate the farming of large numbers of simulation processes. Our method is lightweight, fully automates the entire workflow management, operates concurrently across multiple platforms, and can be installed in user space. Inspired by the process of crop farming, we perceive each simulation as a crop in the field, so that running a simulation becomes analogous to growing a crop. SiMon thus relaxes the technical burden of simulation management. The initial package was developed for extensive parameter searches in numerical simulations, but it turns out to work equally well for automating the processing and reduction of observational data.
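SiMon's actual interface is not described in the abstract; purely to illustrate the restart-on-interrupt idea behind this kind of "simulation farming", here is a hedged, generic watchdog sketch (the command and limits are hypothetical, not SiMon's API):

```python
import subprocess
import time

SIM_CMD = ["./run_simulation", "--resume"]  # hypothetical simulation command
MAX_RESTARTS = 5

def farm(cmd, max_restarts):
    """Keep one simulation alive, restarting it after crashes until it
    exits cleanly or the restart budget is exhausted."""
    for attempt in range(1, max_restarts + 1):
        if subprocess.run(cmd).returncode == 0:
            print("simulation reached its termination condition")
            return
        print(f"interrupted; restarting ({attempt}/{max_restarts})")
        time.sleep(10)  # back off before restarting
    print("restart budget exhausted; manual inspection needed")

farm(SIM_CMD, MAX_RESTARTS)
```

SiMon itself goes further, per the abstract: it manages many simulations concurrently and across platforms rather than babysitting a single process.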
How populist attitudes scales fail to capture support for populists in power
Populist attitudes are generally measured in surveys through three necessary and non-compensatory elements of populism, namely anti-elitism, people-centrism, and Manicheanism. Using Comparative Study of Electoral Systems Module 5 (2016–2020) data for 30 countries, we evaluate whether this approach explains voting for populist parties across countries in Asia, Europe and the Americas. We show that the existing scales of populist attitudes effectively explain voting for populists in countries where populist leaders and parties are in opposition but fail to explain voting for populist parties in countries where they are in power. We argue that current approaches assume "the elite" to mean "politicians", thus failing to capture attitudes towards "non-political elites" often targeted by populists in office—in particular, journalists, academics/experts, bureaucrats, and corporate business leaders. The results reveal limits to the usefulness of existing survey batteries in cross-national studies of populism and emphasize the need to develop approaches that are more generalizable across political and national contexts.
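The "necessary and non-compensatory" conception is commonly operationalized by aggregating the three sub-dimensions with a minimum rather than a mean, so that a low score on any one element caps the overall populism score. A minimal sketch with hypothetical 1-5 survey scales:

```python
import numpy as np

# Hypothetical respondents' mean scores (1-5 scale) on the three elements.
anti_elitism    = np.array([5.0, 4.5, 2.0])
people_centrism = np.array([4.0, 4.8, 4.5])
manicheanism    = np.array([4.5, 1.5, 4.0])

# Non-compensatory aggregation: the weakest element caps the score,
# unlike a mean, where strength on one element offsets weakness on another.
score_min  = np.minimum.reduce([anti_elitism, people_centrism, manicheanism])
score_mean = np.mean([anti_elitism, people_centrism, manicheanism], axis=0)
print(score_min)   # [4.  1.5 2. ]
print(score_mean)  # [4.5 3.6 3.5]
```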
Long-term Periodicities of Cataclysmic Variables with Synoptic Surveys
A systematic study of the long-term periodicities of known Galactic cataclysmic variables (CVs) was conducted. Among 1580 known CVs, 344 sources were matched and extracted from the Palomar Transient Factory (PTF) data repository. The PTF light curves were combined with Catalina Real-Time Transient Survey (CRTS) light curves and analyzed. Ten targets were found to exhibit long-term periodic variability, which is not frequently observed in CV systems. These long-term variations are possibly caused by various mechanisms, such as precession of the accretion disk, a hierarchical triple-star system, or magnetic field changes in the companion star; we discuss these possible mechanisms in this study. If the long-term period is less than several tens of days, the disk-precession scenario is favored. However, a hierarchical triple-star system or variations in magnetic field strength are most likely the predominant mechanisms for longer periods.
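The abstract does not name the period-search method; for irregularly sampled survey light curves such as PTF and CRTS, a Lomb-Scargle periodogram is the standard tool. A minimal sketch with astropy, using synthetic data in place of a real light curve:

```python
import numpy as np
from astropy.timeseries import LombScargle

# Synthetic sparse light curve with a 120-day modulation, standing in
# for a combined PTF + CRTS light curve of a cataclysmic variable.
rng = np.random.default_rng(1)
t = np.sort(rng.uniform(0, 3000, 400))  # observation epochs in days
mag = 17.0 + 0.3 * np.sin(2 * np.pi * t / 120.0) + rng.normal(0, 0.05, t.size)

freq, power = LombScargle(t, mag).autopower(maximum_frequency=1.0)
best_period = 1.0 / freq[np.argmax(power)]
print(f"best-fitting long-term period: {best_period:.1f} d")
```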
A multiphase study of classical Cepheids in the Magellanic Clouds: Models and Observations
This work presents a study of the multiphase relations of classical Cepheids in the Magellanic Clouds for short periods (log P < 1) and long periods (log P > 1). The analysis finds that the multiphase relations obtained from both the models and the observations vary strongly with pulsational phase. The multiphase relations for short and long periods display contrasting behaviour for both the LMC and the SMC. The model relations agree better with the observations in the period-colour (PC) plane than in the period-luminosity (PL) plane in most phases, and they show a clear distinction among the different convection sets in most phases. Comparing models and observations in the multiphase plane is one way to test the models against the observations and to constrain the theory of stellar pulsation.
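As a worked illustration of what a "multiphase" relation means here: instead of fitting a period-luminosity relation at mean light, one fits it at a fixed pulsation phase and repeats over a grid of phases, separately for the short- and long-period samples. A hedged sketch with synthetic data (the paper's data and fitting choices are not given in the abstract):

```python
import numpy as np

# Synthetic Cepheid sample: magnitudes sampled at one fixed pulsation phase.
rng = np.random.default_rng(2)
logP = rng.uniform(0.4, 1.6, 200)  # log10(period / days)
mag_at_phase = 17.0 - 2.8 * logP + rng.normal(0, 0.15, 200)

# PL fit at this phase, done separately for short and long periods;
# repeating over many phases gives the phase-dependent slopes the
# paper compares between models and observations.
for sel, label in [(logP < 1, "short"), (logP > 1, "long")]:
    slope, intercept = np.polyfit(logP[sel], mag_at_phase[sel], 1)
    print(f"{label}-period PL slope at this phase: {slope:.2f}")
```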
Fully Automated Reduction of Longslit Spectroscopy with the Low Resolution Imaging Spectrometer at the Keck Observatory
This paper presents and summarizes a software package ("LPipe") for completely automated, end-to-end reduction of both bright and faint sources with the Low Resolution Imaging Spectrometer (LRIS) at Keck Observatory. It supports all gratings, grisms, and dichroics, and also reduces imaging observations, although it does not include multislit or polarimetric reduction capabilities at present. It is suitable for on-the-fly quicklook reductions at the telescope, for large-scale reductions of archival data sets, and (in many cases) for science-quality post-run reductions of PI data. To demonstrate its capabilities, the pipeline is run in fully automated mode on all LRIS longslit data in the Keck Observatory Archive acquired during the 12-month period between 2016 August and 2017 July. The reduced spectra (of 675 single-object targets, totaling ∼200 hours of on-source integration time in each camera), and the pipeline itself, are made publicly available to the community.
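LPipe's real invocation is not given in the abstract; purely as an illustration of the kind of batch driver used to reduce an archive's worth of longslit data night by night, with every name below hypothetical:

```python
import subprocess
from pathlib import Path

ARCHIVE = Path("koa_lris")         # hypothetical local mirror of archive data
PIPELINE = ["idl", "-e", "lpipe"]  # hypothetical call; consult the LPipe docs

# Reduce each night's raw LRIS data in place, fully unattended.
for night in sorted(p for p in ARCHIVE.iterdir() if p.is_dir()):
    print(f"reducing {night.name} ...")
    subprocess.run(PIPELINE, cwd=night, check=False)
```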