Catalogue Search | MBRL
Search Results
Explore the vast range of titles available.
4,926 result(s) for "Astronomy software"
CASA, the Common Astronomy Software Applications for Radio Astronomy
by Castro, Sandra; Sekhar, Srikrishna; Griffith, Morgan
in Aperture synthesis; Astronomy; Astronomy data analysis
2022
CASA, the Common Astronomy Software Applications, is the primary data processing software for the Atacama Large Millimeter/submillimeter Array (ALMA) and the Karl G. Jansky Very Large Array (VLA), and is also frequently used for other radio telescopes. The CASA software can handle data from single-dish, aperture-synthesis, and Very Long Baseline Interferometry (VLBI) telescopes. One of its core functionalities is to support the calibration and imaging pipelines for ALMA, the VLA, the VLA Sky Survey, and the Nobeyama 45 m telescope. This paper presents a high-level overview of the basic structure of the CASA software, as well as procedures for calibrating and imaging astronomical radio data in CASA. CASA is being developed by an international consortium of scientists and software engineers based at the National Radio Astronomy Observatory (NRAO), the European Southern Observatory, the National Astronomical Observatory of Japan, and the Joint Institute for VLBI European Research Infrastructure Consortium (JIV-ERIC), under the guidance of NRAO.
Journal Article
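The calibration described in this abstract rests on solving for per-antenna complex gains from baseline visibilities. As a minimal illustration only (a noiseless closed-form sketch with hypothetical names, not CASA's actual solver), consider a unit-flux point-source calibrator, for which each baseline measures V_ij = g_i · conj(g_j):

```python
def solve_gains_point_source(vis, n_ant):
    """Recover per-antenna complex gains g_i from noiseless baseline
    visibilities V_ij = g_i * conj(g_j) of a unit-flux point-source
    calibrator, with the phase referenced to antenna 0.

    `vis` maps each baseline (i, j) with i < j to a complex visibility."""
    # amplitude of the reference antenna from the (0, 1, 2) triangle:
    # |V_01| * |V_02| / |V_12| = |g_0|^2
    a0 = (abs(vis[(0, 1)]) * abs(vis[(0, 2)]) / abs(vis[(1, 2)])) ** 0.5
    gains = [complex(a0, 0.0)]
    # remaining gains follow from baselines to the reference antenna:
    # V_0k = g_0 * conj(g_k)  =>  g_k = conj(V_0k) / |g_0|
    for k in range(1, n_ant):
        gains.append(vis[(0, k)].conjugate() / a0)
    return gains
```

Real solvers such as CASA's gaincal instead fit all baselines in a least-squares sense to average down noise; the triangle solution above only illustrates why an array of three or more antennas over-determines the gains.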
CASA on the Fringe—Development of VLBI Processing Capabilities for CASA
by Goddi, Ciriaco; Verkouter, Marjolein; Moellenbrock, George A.
in Astronomy software; Calibration; Data processing
2022
New functionality to process Very Long Baseline Interferometry (VLBI) data has been implemented in the CASA package. This includes two new tasks to handle fringe fitting and VLBI-specific amplitude calibration steps. Existing tasks have been adjusted to handle VLBI visibility data and calibration metadata properly. With these updates, it is now possible to process VLBI continuum and spectral line observations in CASA. This article describes the development and implementation, and presents an outline of the workflow when calibrating European VLBI Network or Very Long Baseline Array data in CASA. Though the CASA VLBI functionality has already been vetted extensively as part of the Event Horizon Telescope data processing, in this paper we compare results for the same data set processed in CASA and AIPS. We find identical results for the two packages; in some cases CASA performs better, though it cannot match AIPS for single-core processing time. The new functionality in CASA allows for easy development of pipelines or Jupyter notebooks, and thus contributes to raising VLBI data processing to present-day standards for accessibility, reproducibility, and reusability.
Journal Article
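The fringe fitting this abstract mentions solves for residual delays: a group delay τ imprints a linear phase slope exp(2πi·ν·τ) across frequency channels, so scanning trial delays for the peak of the coherent channel sum recovers τ. A toy single-baseline sketch of that idea (illustrative names, not the CASA task):

```python
import cmath

def estimate_delay(vis, chan_width_hz, oversample=16):
    """Toy single-baseline fringe fit: a residual group delay tau adds
    a phase slope exp(2*pi*i * nu * tau) across frequency channels, so
    the magnitude of the coherent sum over channels peaks at the true
    delay. Scans a delay grid finer than the natural resolution."""
    n = len(vis)
    best_tau, best_amp = 0.0, -1.0
    span = 1.0 / (2.0 * chan_width_hz)   # delay ambiguity (Nyquist) window
    steps = n * oversample               # grid finer than 1 / (n * dnu)
    for k in range(-steps, steps + 1):
        tau = span * k / steps
        s = sum(v * cmath.exp(-2j * cmath.pi * chan_width_hz * i * tau)
                for i, v in enumerate(vis))
        if abs(s) > best_amp:
            best_amp, best_tau = abs(s), tau
    return best_tau
```

Production fringe fitters additionally solve for delay rate and phase per antenna (not per baseline), and use an FFT rather than a brute-force grid scan; the scan above just makes the peak-search structure explicit.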
The SNAD Viewer: Everything You Want to Know about Your Favorite ZTF Object
by Korolev, Vladimir S.; Kornilov, Matwey V.; Voloshina, Anastasiya
in Astronomy; Astronomy software; Astronomy web services
2023
We describe the SNAD Viewer, a web portal for astronomers that presents a centralized view of individual objects from the Zwicky Transient Facility’s (ZTF) data releases, including data gathered from multiple publicly available astronomical archives and data sources. Initially built to enable efficient expert feedback in the context of adaptive machine learning applications, it has evolved into a full-fledged community asset that centralizes public information and provides a multi-dimensional view of ZTF sources. For users, we provide detailed descriptions of the data sources and choices underlying the information displayed in the portal. For developers, we describe our architectural choices and their consequences, so that our experience can help others engaged in similar endeavors or in adapting our publicly released code to their requirements. The infrastructure we describe here is scalable and flexible, and can be personalized and used by other surveys and for other science goals. The Viewer has been instrumental in highlighting the crucial roles domain experts retain in the era of big data in astronomy. Given the arrival of the upcoming generation of large-scale surveys, we believe similar systems will be paramount in enabling optimal exploitation of the scientific potential enclosed in current terabyte- and future petabyte-scale data sets. The Viewer is publicly available online at https://ztf.snad.space.
Journal Article
A Characterization of the ALMA Phasing System at 345 GHz
by Martí-Vidal, I.; Rottmann, H.; Goddi, C.
in Astronomical instrumentation; Astronomical techniques; Astronomy data acquisition
2023
The development of the Atacama Large Millimeter/submillimeter Array (ALMA) phasing system (APS) has allowed ALMA to function as an extraordinarily sensitive station for very long baseline interferometry (VLBI) at frequencies of up to 230 GHz (λ ≈ 1.3 mm). Efforts are now underway to extend the use of the APS to 345 GHz (λ ≈ 0.87 mm). Here we report a characterization of APS performance at 345 GHz based on a series of tests carried out between 2015 and 2021, including a successful global VLBI test campaign conducted in 2018 October in collaboration with the Event Horizon Telescope.
Journal Article
An Algorithm for Coordinate Matching in World Coordinate Solutions
2020
Algorithms for point-source extraction and catalog-to-image coordinate matching for world coordinate solutions are presented. In particular, the coordinate-matching algorithm is lightweight, simple to understand, easy to code, and solves orders of magnitude more quickly than existing solutions to this common astrometric problem.
Journal Article
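The speed claim in the abstract comes down to avoiding an all-pairs distance scan. A common lightweight approach (this sketch is generic spatial hashing with illustrative names, not necessarily the paper's algorithm) buckets catalog positions into a grid whose cells are the size of the match tolerance, so each image source only inspects the 3×3 neighborhood of cells:

```python
from collections import defaultdict

def match_sources(image_xy, catalog_xy, tol):
    """Match each extracted image source to the nearest catalog source
    within `tol` pixels. Catalog positions are hashed into grid cells
    of side `tol`, so each query checks at most 9 cells instead of the
    whole catalog. Returns {image_index: catalog_index}."""
    grid = defaultdict(list)
    for idx, (x, y) in enumerate(catalog_xy):
        grid[(int(x // tol), int(y // tol))].append(idx)
    matches = {}
    for i, (x, y) in enumerate(image_xy):
        cx, cy = int(x // tol), int(y // tol)
        best, best_d2 = None, tol * tol
        for gx in (cx - 1, cx, cx + 1):
            for gy in (cy - 1, cy, cy + 1):
                for j in grid.get((gx, gy), []):
                    px, py = catalog_xy[j]
                    d2 = (x - px) ** 2 + (y - py) ** 2
                    if d2 < best_d2:
                        best, best_d2 = j, d2
        if best is not None:
            matches[i] = best
    return matches
```

Building the hash is O(catalog size) and each query is O(local density), versus O(image × catalog) for the naive scan, which is where the orders-of-magnitude speedup comes from.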
Automated Extended Aperture Photometry of K2 Variable Stars
by Szabó, Róbert; Szabó, Pál; Molnár, László
in Astronomy software; Photometry; Pulsating variable stars
2022
Proper photometric data are challenging to obtain in the K2 mission of the Kepler space telescope due to strong systematics caused by the two-wheel-mode operation. This is especially true for variable stars in which physical phenomena occur on timescales similar to the instrumental signals. We originally developed a method, named Extended Aperture Photometry (EAP), that extends the photometric aperture to compensate for the motion of the telescope. Here we present an outline of the automated version of the EAP method, an open-source pipeline called autoEAP. We compare the light curve products to other photometric solutions for examples chosen from high-amplitude variable stars. In addition to the photometry, we developed a new detrending method, which is based on phase dispersion minimization and is able to eliminate long-term instrumental signals for periodic variable stars.
Journal Article
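The phase dispersion minimization the abstract mentions needs no assumption about the light curve shape: fold the data at a trial period, and if the period is right, points in the same phase bin cluster tightly. A minimal sketch of the statistic (illustrative only, not the autoEAP implementation):

```python
def pdm_period(times, mags, trial_periods, n_bins=10):
    """Phase dispersion minimization: fold the light curve at each
    trial period, bin by phase, and score by the ratio of the summed
    within-bin scatter to the total scatter. The correct period
    minimizes this dispersion statistic (theta)."""
    mean = sum(mags) / len(mags)
    total_var = sum((m - mean) ** 2 for m in mags)
    best_p, best_theta = None, float("inf")
    for p in trial_periods:
        bins = [[] for _ in range(n_bins)]
        for t, m in zip(times, mags):
            phase = (t / p) % 1.0
            bins[min(int(phase * n_bins), n_bins - 1)].append(m)
        within = 0.0
        for b in bins:
            if len(b) > 1:
                bm = sum(b) / len(b)
                within += sum((m - bm) ** 2 for m in b)
        theta = within / total_var
        if theta < best_theta:
            best_theta, best_p = theta, p
    return best_p
```

Because the statistic only compares scatter within phase bins, it works equally well for the strongly non-sinusoidal light curves of high-amplitude pulsators, which is why it suits the detrending application described above.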
The Autonomous Data Reduction Pipeline for the CUTE Mission
by France, Kevin; Haas, Stephanie; Fleming, Brian T.
in Adaptability; Astronomy software; Data reduction
2022
The Colorado Ultraviolet Transit Experiment (CUTE) is a 6U NASA CubeSat carrying on board a low-resolution, near-ultraviolet (2479–3306 Å) spectrograph. It has a Cassegrain telescope with a rectangular primary to maximize the collecting area given the shape of the satellite bus, and an aberration-correcting grating to improve the image quality, and thus the spectral resolution. CUTE, launched on 2021 September 27 to a low Earth orbit, is designed to monitor transiting extrasolar planets orbiting bright, nearby stars to improve our understanding of planetary atmospheric escape and star–planet interaction processes. We present here the CUTE autONomous daTa ReductiOn pipeLine, developed for reducing CUTE data. The pipeline has been structured with a modular approach, which also considers scalability and adaptability to other missions carrying on board a long-slit spectrograph. The CUTE data simulator has been used to generate synthetic observations for developing and testing the pipeline functionalities. The pipeline has been tested and updated employing flight data obtained during commissioning and initial science operations of the mission.
Journal Article
Asteroid Observations from the Transiting Exoplanet Survey Satellite: Detection Processing Pipeline and Results from Primary Mission Data
by Vaillancourt, John E.; Woods, Deborah F.; Kotson, Michael C.
in Algorithms; Asteroid observations; Asteroids
2021
The Transiting Exoplanet Survey Satellite (TESS) is a NASA Explorer-class mission designed for finding exoplanets around nearby stars. TESS image data can also serve as a valuable resource for asteroid and comet detection, including near-Earth objects (NEOs). In order to exploit the TESS image data for moving object detection and potential object discovery, our team has developed an image processing pipeline as part of the Lincoln Near-Earth Asteroid Research (LINEAR) program, sponsored by the NASA NEO Observations Program. The LINEAR-TESS pipeline is currently in operation and reporting asteroid observations to the Minor Planet Center. In this paper we discuss the algorithms and methodology utilized to push the limits of the astrometric accuracy and photometric sensitivity of the TESS instrument for asteroid detection without a priori information on the ephemerides of the objects, and report on observation statistics from the first two years of TESS mission data.
Journal Article
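Detecting movers without a priori ephemerides, as this abstract describes, typically reduces to linking per-frame detections that are consistent with linear motion. A toy three-frame linker (a generic illustration with made-up names, not the LINEAR-TESS pipeline):

```python
def link_tracklets(frames, times, tol, min_speed=0.5):
    """Link detections (x, y) across three frames into candidate
    moving-object tracklets: seed a velocity from frames 0 and 1,
    predict the position in frame 2 assuming constant linear motion,
    and accept the triple if a frame-2 detection lies within `tol`.
    A minimum-speed cut rejects stationary sources (stars)."""
    t0, t1, t2 = times
    tracklets = []
    for a in frames[0]:
        for b in frames[1]:
            vx = (b[0] - a[0]) / (t1 - t0)
            vy = (b[1] - a[1]) / (t1 - t0)
            if vx * vx + vy * vy < min_speed * min_speed:
                continue  # essentially motionless: almost certainly a star
            # extrapolate the seed velocity to the third epoch
            px = b[0] + vx * (t2 - t1)
            py = b[1] + vy * (t2 - t1)
            for c in frames[2]:
                if (c[0] - px) ** 2 + (c[1] - py) ** 2 <= tol * tol:
                    tracklets.append((a, b, c))
    return tracklets
```

Operational pipelines add many refinements (star-catalog masking, more epochs, curvature terms, and confusion handling), but the seed-and-predict structure above is the common core.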
Wayne State University’s Dan Zowada Memorial Observatory: Characterization and Pipeline of a 0.5 m Robotic Telescope
by Carr, Robert; Cinabro, David; Moutard, David
in Active galactic nuclei; Astronomy; Astronomy data acquisition
2022
Wayne State University’s Dan Zowada Memorial Observatory is a fully robotic 0.5 m telescope and imaging system located under the dark skies of New Mexico. The observatory is particularly suited to time-domain astronomy: the observation of variable objects, such as tidal disruption events, supernovae, and active galactic nuclei. We have developed a software suite for image reduction, alignment and stacking, and calculation of absolute photometry in the Sloan filters used at the telescope. Our pipeline also performs image subtraction to enable photometry of objects embedded in bright backgrounds such as galaxies. The 5σ detection limit of the Zowada Observatory for an integration of 16 × 90 s exposures is 19.0 mag in the g band, 18.1 mag in the r band, 17.9 mag in the i band, and 16.6 mag in the z band. For a 3σ detection limit, measurements may be performed, with greater uncertainties, as deep as 19.9, 19.1, 18.9, and 17.5 mag in the griz bands, respectively.
Journal Article
Model-based Performance Characterization of Software Correlators for Radio Interferometer Arrays
2022
Correlation for radio interferometer array applications, including Very Long Baseline Interferometry (VLBI), is a multidisciplinary field that traditionally involves astronomy, geodesy, signal processing, and electronic design. In recent years, however, high-performance computing has been taking over electronic design, complicating this mix with the addition of network engineering, parallel programming, and resource scheduling, among others. High-performance applications go a step further by using specialized hardware like Graphics Processing Units (GPUs) or Field Programmable Gate Arrays (FPGAs), challenging engineers to build and maintain high-performance correlators that efficiently use the available resources. Existing literature has generally benchmarked correlators through narrow comparisons on specific scenarios, and the lack of a formal performance characterization prevents a systematic comparison. This combination of ongoing increasing complexity in software correlation together with the lack of performance models in the literature motivates the development of a performance model that allows us not only to characterize existing correlators and predict their performance in different scenarios but, more importantly, to provide an understanding of the trade-offs inherent to the decisions associated with their design. In this paper, we present a model that achieves both objectives. We validate this model against benchmarking results in the literature, and provide an example for its application for improving cost-effectiveness in the usage of cloud resources.
Journal Article
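A performance model of the kind this abstract describes can start from first-order operation counts: the F-stage (channelization) scales as N·log Nch per antenna, while the X-stage (cross-multiplication) scales with the N² baseline count. This back-of-the-envelope sketch uses illustrative constants (5 flops per FFT point, 8 flops per complex multiply-accumulate) and is an assumption-laden toy, not the paper's model:

```python
import math

def fx_correlator_gflops(n_ant, bandwidth_hz, n_chan, n_pol=2):
    """Rough compute-cost model for an FX software correlator.
    F-stage: each antenna/polarization stream is channelized by FFTs,
    ~5 * log2(Nfft) flops per real sample at the Nyquist rate.
    X-stage: every baseline (including autocorrelations) and
    polarization product is multiplied and accumulated per channel,
    ~8 flops per complex MAC; each product is updated once per FFT
    frame, i.e. sample_rate / 2 times per second in total."""
    sample_rate = 2.0 * bandwidth_hz              # Nyquist, real sampling
    f_flops = n_ant * n_pol * sample_rate * 5.0 * math.log2(2 * n_chan)
    n_products = n_ant * (n_ant + 1) // 2 * n_pol * n_pol
    x_flops = n_products * 8.0 * sample_rate / 2.0
    return (f_flops + x_flops) / 1e9
```

Even this toy model exposes the central design trade-off the paper discusses: the F-stage grows linearly with array size while the X-stage grows quadratically, so beyond some antenna count the cross-multiplication dominates and drives the choice of hardware and resource scheduling.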