31 result(s) for "Cisewski-Kehe, Jessi"
A HERMITE–GAUSSIAN BASED EXOPLANET RADIAL VELOCITY ESTIMATION METHOD
As the first successful technique used to detect exoplanets orbiting distant stars, the radial velocity method aims to detect a periodic Doppler shift in a stellar spectrum due to the star's motion along the line of sight. We introduce a new, mathematically rigorous approach to detect such a signal that accounts for the smooth functional relationship of neighboring wavelengths in the spectrum, minimizes the role of wavelength interpolation, accounts for heteroskedastic noise and easily allows for accurate calculation of the estimated radial velocity standard error. Using Hermite–Gaussian functions, we show that the problem of detecting a Doppler shift in the spectrum can be reduced to linear regression in many settings. A simulation study demonstrates that the proposed method is able to accurately estimate an individual spectrum's radial velocity with precision below 0.3 m s−1, corresponding to a Doppler shift much smaller than the size of a spectral pixel. Furthermore, the new method outperforms the traditional cross-correlation function approach for estimating the radial velocity by reducing the root mean squared error by up to 15 cm s−1. The proposed method is also demonstrated on a new set of observations from the EXtreme PREcision Spectrometer (EXPRES) for the host star 51 Pegasi, and successfully recovers estimates of the planetary companion's parameters that agree well with previous studies. The method is implemented in the R package rvmethod, and supplemental Python code is also available.
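As a rough illustration of the regression reduction described above (a toy sketch, not the rvmethod package; the wavelength grid, Gaussian line profile, and injected velocity are all made up), a small Doppler shift v maps a template spectrum f(x) to f(x(1 − v/c)), so the residual observed − template is approximately −(v/c)·x·f′(x), and v falls out of a least-squares fit:

```python
import numpy as np

c = 299792458.0                       # speed of light, m/s
x = np.linspace(5000.0, 5001.0, 400)  # wavelength grid (angstroms), assumed
# template: a single Gaussian absorption line (illustrative, not a real spectrum)
template = 1.0 - 0.5 * np.exp(-0.5 * ((x - 5000.5) / 0.02) ** 2)

v_true = 50.0                         # injected radial velocity, m/s
# observed flux at x corresponds to the template evaluated at x*(1 - v/c)
observed = 1.0 - 0.5 * np.exp(-0.5 * ((x * (1 - v_true / c) - 5000.5) / 0.02) ** 2)

dfdx = np.gradient(template, x)       # numerical derivative of the template
design = -(x / c) * dfdx              # regressor whose coefficient is v
v_hat = float(np.linalg.lstsq(design[:, None], observed - template, rcond=None)[0][0])
```

The recovered `v_hat` lands close to the injected 50 m s−1, showing how the shift estimate becomes an ordinary linear-regression coefficient.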
High-energy Neutrino Source Cross-correlations with Nearest-neighbor Distributions
The astrophysical origins of the majority of the IceCube neutrinos remain unknown. Effectively characterizing the spatial distribution of the neutrino samples and associating the events with astrophysical source catalogs can be challenging given the large atmospheric neutrino background and underlying non-Gaussian spatial features in the neutrino and source samples. In this paper, we investigate a framework for identifying and statistically evaluating the cross-correlations between IceCube data and an astrophysical source catalog based on the k-nearest-neighbor cumulative distribution functions (kNN-CDFs). We propose a maximum likelihood estimation procedure for inferring the true proportions of astrophysical neutrinos in the point-source data. We conduct a statistical power analysis of an associated likelihood ratio test with estimations of its sensitivity and discovery potential with synthetic neutrino data samples and a WISE-2MASS galaxy sample. We apply the method to IceCube's public ten-year point-source data and find no statistically significant evidence for spatial cross-correlations with the selected galaxy sample. We discuss possible extensions to the current method and explore the method's potential to identify the cross-correlation signals in data sets with different sample sizes.
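A minimal sketch of the kNN-CDF statistic itself (hypothetical two-dimensional mock data on a unit square; the paper's likelihood-ratio machinery is not reproduced): for random query points, record the distance to the k-th nearest data point, and take the empirical CDF of those distances.

```python
import numpy as np

rng = np.random.default_rng(0)
data = rng.random((500, 2))      # mock "source" positions
queries = rng.random((2000, 2))  # random query points

def knn_cdf(data, queries, k, r_grid):
    # brute-force distance from every query point to every data point
    d = np.linalg.norm(queries[:, None, :] - data[None, :, :], axis=2)
    dk = np.sort(d, axis=1)[:, k - 1]          # k-th nearest-neighbor distance
    return np.array([(dk <= r).mean() for r in r_grid])

r_grid = np.linspace(0.0, 0.2, 50)
cdf1 = knn_cdf(data, queries, k=1, r_grid=r_grid)
```

Clustered data pull the kNN-CDF away from the curve produced by a uniform sample of the same size, which is the kind of departure the cross-correlation framework tests for.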
MaxTDA: Robust Statistical Inference for Maximal Persistence in Topological Data Analysis
Persistent homology is an area within topological data analysis (TDA) that can uncover different dimensional holes (connected components, loops, voids, etc.) in data. The holes are characterized, in part, by how long they persist across different scales. Noisy data can result in many additional holes that are not true topological signal. Various robust TDA techniques have been proposed to reduce the number of noisy holes; however, these robust methods have a tendency to also reduce the topological signal. This work introduces Maximal TDA (MaxTDA), a statistical framework addressing a limitation in TDA wherein robust inference techniques systematically underestimate the persistence of significant homological features. MaxTDA combines kernel density estimation with level-set thresholding via rejection sampling to generate consistent estimators for the maximal persistence features that minimize bias while maintaining robustness to noise and outliers. We establish the consistency of the sampling procedure and the stability of the maximal persistence estimator. The framework also enables statistical inference on topological features through rejection bands, constructed from quantiles that bound the estimator's deviation probability. MaxTDA is particularly valuable in applications where precise quantification of statistically significant topological features is essential for revealing underlying structural properties in complex datasets. Numerical simulations across varied datasets, including an example from exoplanet astronomy, highlight the effectiveness of MaxTDA in recovering true topological signals.
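The level-set rejection step can be sketched as follows (an assumed simplification: a hand-rolled Gaussian KDE and a quantile threshold stand in for MaxTDA's calibrated procedure). Low-density outliers are rejected before any persistent homology computation, so they cannot spawn spurious holes.

```python
import numpy as np

rng = np.random.default_rng(1)
# noisy circle: the "true" topological signal is one loop
theta = rng.uniform(0, 2 * np.pi, 300)
circle = np.c_[np.cos(theta), np.sin(theta)] + 0.05 * rng.standard_normal((300, 2))
noise = rng.uniform(-1.5, 1.5, (30, 2))       # sparse outliers
sample = np.vstack([circle, noise])

def kde(points, query, h=0.1):
    # simple Gaussian kernel density estimate on 2-d data
    d2 = ((query[:, None, :] - points[None, :, :]) ** 2).sum(axis=2)
    return np.exp(-d2 / (2 * h * h)).mean(axis=1) / (2 * np.pi * h * h)

dens = kde(sample, sample)
threshold = np.quantile(dens, 0.2)            # reject the lowest-density 20%
kept = sample[dens > threshold]               # points surviving the level set
```

Points on the densely sampled circle survive while isolated outliers mostly fall below the density threshold.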
A Subsequence Approach to Topological Data Analysis for Irregularly-Spaced Time Series
A time-delay embedding (TDE), grounded in the framework of Takens's Theorem, provides a mechanism to represent and analyze the inherent dynamics of time-series data. Recently, topological data analysis (TDA) methods have been applied to study this time series representation mainly through the lens of persistent homology. Current literature on the fusion of TDE and TDA is adept at analyzing uniformly-spaced time series observations. This work introduces a novel subsequence embedding method for irregularly-spaced time-series data. We show that this method preserves the original state space topology while reducing spurious homological features. Theoretical stability results and convergence properties of the proposed method in the presence of noise and varying levels of irregularity in the spacing of the time series are established. Numerical studies and an application to real data illustrate the performance of the proposed method.
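For context, the uniformly-spaced time-delay embedding that the subsequence method generalizes can be written in a few lines (parameter choices are illustrative, not from the paper):

```python
import numpy as np

def time_delay_embedding(x, dim=3, tau=5):
    # map a scalar series to points (x[t], x[t+tau], ..., x[t+(dim-1)*tau])
    n = len(x) - (dim - 1) * tau
    return np.column_stack([x[i * tau : i * tau + n] for i in range(dim)])

# a sine wave with tau near a quarter period embeds as a loop,
# whose one-dimensional hole persistent homology can then detect
t = np.linspace(0, 4 * np.pi, 200)
cloud = time_delay_embedding(np.sin(t), dim=2, tau=25)
```

With a well-chosen delay the embedded point cloud traces out the underlying attractor; irregular sampling breaks the fixed-lag indexing above, which is the gap the subsequence approach addresses.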
A Divide-and-Conquer Approach to Persistent Homology
Persistent homology is a tool of topological data analysis that has been used in a variety of settings to characterize different dimensional holes in data. However, persistent homology computations can be memory-intensive with a computational complexity that does not scale well as the data size becomes large. In this work, we propose a divide-and-conquer (DaC) method to mitigate these issues. The proposed algorithm efficiently finds small, medium, and large-scale holes by partitioning data into sub-regions and uses a Vietoris-Rips filtration. Furthermore, we provide theoretical results that quantify the bottleneck distance between the DaC and true persistence diagrams and the recovery probability of holes in the data. We empirically verify that the observed rate coincides with our theoretical rate, and find that the memory and computational complexity of DaC outperforms an alternative method that relies on a clustering preprocessing step to reduce the memory and computational complexity of the persistent homology computations. Finally, we test our algorithm using spatial data of the locations of lakes in Wisconsin, where classical persistent homology is computationally infeasible.
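The partition step of a divide-and-conquer scheme might look like the following sketch (the overlap width, grid layout, and merge logic of the actual DaC algorithm are not reproduced here): split the point cloud into overlapping spatial blocks so persistent homology can be computed per block on much smaller inputs.

```python
import numpy as np

def partition(points, n_side=2, overlap=0.1):
    # split a 2-d point cloud into an n_side x n_side grid of blocks,
    # padding each block by `overlap` so boundary features are not lost
    lo, hi = points.min(axis=0), points.max(axis=0)
    xe = np.linspace(lo[0], hi[0], n_side + 1)
    ye = np.linspace(lo[1], hi[1], n_side + 1)
    blocks = []
    for i in range(n_side):
        for j in range(n_side):
            m = ((points[:, 0] >= xe[i] - overlap) & (points[:, 0] <= xe[i + 1] + overlap)
                 & (points[:, 1] >= ye[j] - overlap) & (points[:, 1] <= ye[j + 1] + overlap))
            blocks.append(points[m])
    return blocks

rng = np.random.default_rng(2)
pts = rng.random((1000, 2))
blocks = partition(pts)   # each block would be fed to a persistent homology solver
```

The overlap means some points appear in multiple blocks; that redundancy is what lets features straddling a block boundary be recovered when per-block diagrams are combined.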
Searching for Low-Mass Exoplanets Amid Stellar Variability with a Fixed Effects Linear Model of Line-by-Line Shape Changes
The radial velocity (RV) method, also known as Doppler spectroscopy, is a powerful technique for exoplanet discovery and characterization. In recent years, progress has been made thanks to the improvements in the quality of spectra from new extreme precision RV spectrometers. However, detecting the RV signals of Earth-like exoplanets remains challenging, as the spectroscopic signatures of low-mass planets can be obscured or confused with intrinsic stellar variability. Changes in the shapes of spectral lines across time can provide valuable information for disentangling stellar activity from true Doppler shifts caused by low-mass exoplanets. In this work, we present a fixed effects linear model to estimate RV signals that controls for changes in line shapes by aggregating information from hundreds of spectral lines. Our methodology incorporates a wild-bootstrap approach for modeling uncertainty and cross-validation to control for overfitting. We evaluate the model's ability to remove stellar activity using solar observations from the NEID spectrograph, as the sun's true center-of-mass motion is precisely known. Including line shape-change covariates reduces the RV root-mean-square errors by approximately 70% (from 1.919 m s−1 to 0.575 m s−1) relative to using only the line-by-line Doppler shifts. The magnitude of the residuals is significantly less than that from traditional CCF-based RV estimators and comparable to other state-of-the-art methods for mitigating stellar variability.
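A toy version of the fixed effects idea (entirely synthetic data, a single shape covariate standing in for the paper's hundreds of line-shape metrics, and no wild bootstrap): per-line RVs share a common per-epoch Doppler shift plus an activity term proportional to a measured line-shape change, and a linear model with epoch fixed effects plus the shape covariate separates the two.

```python
import numpy as np

rng = np.random.default_rng(3)
n_times, n_lines = 40, 30
true_rv = np.sin(np.linspace(0, 2 * np.pi, n_times))        # planet signal, m/s
line_shape = rng.standard_normal((n_times, n_lines))        # line-shape metric
# activity leaks into the per-line RVs with coefficient 2.0, plus white noise
rv = true_rv[:, None] + 2.0 * line_shape + 0.1 * rng.standard_normal((n_times, n_lines))

# design matrix: one fixed-effect indicator per epoch, plus the shape covariate
y = rv.ravel()
epochs = np.repeat(np.eye(n_times), n_lines, axis=0)
X = np.hstack([epochs, line_shape.ravel()[:, None]])
beta = np.linalg.lstsq(X, y, rcond=None)[0]
rv_hat = beta[:n_times]       # cleaned per-epoch RVs; beta[-1] is the leak coefficient
```

The epoch fixed effects absorb the shared Doppler shift while the shape covariate soaks up the activity term, so `rv_hat` tracks the injected signal far better than a plain per-epoch average would.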
Tensor Computation of Euler Characteristic Functions and Transforms
The weighted Euler characteristic transform (WECT) and Euler characteristic function (ECF) have proven to be useful tools in a variety of applications. However, current methods for computing these functions are either not optimized for GPU computation or do not scale to higher-dimensional settings. In this work, we present a tensor-based framework for computing such topological descriptors which is highly optimized for GPU architectures and works in full generality across simplicial and cubical complexes of arbitrary dimension. Experimentally, the framework demonstrates significant speedups over existing methods when computing the WECT and ECF across a variety of two- and three-dimensional datasets. Computation of these transforms is implemented in a publicly available Python package called pyECT.
Vectorized Computation of Euler Characteristic Functions and Transforms
The weighted Euler characteristic transform (WECT) and Euler characteristic function (ECF) have proven to be useful tools in a variety of applications. However, current methods for computing these functions are either not optimized for speed or do not scale to higher-dimensional settings. In this work, we present a vectorized framework for computing such topological descriptors using tensor operations, which is highly optimized for GPU architectures and works in full generality across simplicial and cubical complexes of arbitrary dimension. Experimentally, the framework demonstrates significant speedups over existing methods when computing the WECT and ECF across a variety of two- and three-dimensional datasets. Computation of these transforms is implemented in a publicly available Python package called pyECT.
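To make the Euler characteristic function concrete, here is a from-scratch vectorized ECF for a grayscale image's sublevel-set filtration (an illustration only; pyECT's actual API may differ). Pixels are treated as vertices, 4-adjacent active pairs as edges, and fully active 2×2 blocks as squares, so χ = V − E + F.

```python
import numpy as np

def euler_characteristic(B):
    # B: boolean image; cubical-complex counts computed by array shifts
    V = B.sum()                                               # active pixels
    E = (B[1:, :] & B[:-1, :]).sum() + (B[:, 1:] & B[:, :-1]).sum()
    F = (B[1:, 1:] & B[1:, :-1] & B[:-1, 1:] & B[:-1, :-1]).sum()
    return int(V - E + F)

# tiny test image: corners appear first, then a ring, then the filled square
img = np.array([[0.1, 0.2, 0.1],
                [0.2, 0.9, 0.2],
                [0.1, 0.2, 0.1]])
thresholds = [0.05, 0.15, 0.5, 1.0]
ecf = [euler_characteristic(img <= t) for t in thresholds]
# empty set (0), four isolated corners (4), a ring (0), a filled square (1)
```

Every count is a pure array-shift operation, which is the kind of structure that vectorizes well and moves cleanly to a GPU.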
Tracking Temporal Evolution of Topological Features in Image Data
Topological Data Analysis (TDA) can be used to detect and characterize holes in an image, such as zero-dimensional holes (connected components) or one-dimensional holes (loops). However, there is currently no widely accepted statistical framework for modeling spatiotemporal dependence in the evolution of topological features, such as holes, within a time series of images. We propose a hypothesis testing framework to identify statistically significant topological features of images in space and time, simultaneously. This addition of time may induce higher-dimensional topological features which can be used to establish temporal connections between the lower-dimensional features at each point in time. The temporal evolution of these lower-dimensional features is then represented on a zigzag persistence diagram, as a topological summary statistic focused on time dynamics. We demonstrate that the method effectively captures the emergence and progression of topological features in a study of a series of images of a wounded cell as it repairs itself. The proposed method outperforms a current approach in a simulation study that includes features of the wound healing process. Since the wounded-cell images exhibit nonlinear, dynamic, spatial, and temporal structures during single-cell repair, they provide a good application for this method.
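The temporal-linking idea can be caricatured with plain connected-component tracking (hypothetical 5×5 frames; zigzag persistence itself needs a TDA library and is not implemented here): label components per frame, then link a component at time t to one at time t+1 when their pixel supports overlap.

```python
import numpy as np

def label_components(B):
    # 4-connected component labeling of a boolean image via depth-first search
    labels = np.zeros(B.shape, dtype=int)
    cur = 0
    for i in range(B.shape[0]):
        for j in range(B.shape[1]):
            if B[i, j] and labels[i, j] == 0:
                cur += 1
                stack = [(i, j)]
                while stack:
                    a, b = stack.pop()
                    if (0 <= a < B.shape[0] and 0 <= b < B.shape[1]
                            and B[a, b] and labels[a, b] == 0):
                        labels[a, b] = cur
                        stack += [(a + 1, b), (a - 1, b), (a, b + 1), (a, b - 1)]
    return labels, cur

frame0 = np.zeros((5, 5), dtype=bool); frame0[1:3, 1:3] = True   # blob at t
frame1 = np.zeros((5, 5), dtype=bool); frame1[2:4, 2:4] = True   # blob at t+1
l0, n0 = label_components(frame0)
l1, n1 = label_components(frame1)
overlap = np.any((l0 == 1) & (l1 == 1))   # True: the feature persists across frames
```

Zigzag persistence formalizes exactly this kind of birth/continuation/death bookkeeping, but over all homological dimensions rather than connected components alone.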
A Preferential Attachment Model for the Stellar Initial Mass Function
Accurate specification of a likelihood function is becoming increasingly difficult in many inference problems in astronomy. As sample sizes resulting from astronomical surveys continue to grow, deficiencies in the likelihood function lead to larger biases in key parameter estimates. These deficiencies result from the oversimplification of the physical processes that generated the data, and from the failure to account for observational limitations. Unfortunately, realistic models often do not yield an analytical form for the likelihood. The estimation of a stellar initial mass function (IMF) is an important example. The stellar IMF is the mass distribution of stars initially formed in a given cluster of stars, a population that is not directly observable due to stellar evolution, other disruptions, and observational limitations of the cluster. There are several difficulties with specifying a likelihood in this setting since the physical processes and observational challenges result in measurable masses that cannot legitimately be considered independent draws from an IMF. This work improves inference of the IMF by using an approximate Bayesian computation approach that both accounts for observational and astrophysical effects and incorporates a physically-motivated model for star cluster formation. The methodology is illustrated via a simulation study, demonstrating that the proposed approach can recover the true posterior in realistic situations, and applied to observations from astrophysical simulation data.
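A bare-bones ABC rejection sampler on a toy power-law "IMF" (the slope prior, summary statistic, and tolerance here are illustrative stand-ins, not the paper's preferential attachment model): simulate masses for candidate slopes and keep slopes whose summary statistic lands close to the observed one, sidestepping any explicit likelihood.

```python
import numpy as np

rng = np.random.default_rng(4)

def simulate(alpha, n=500):
    # draw n masses from a power law with pdf slope alpha and minimum mass 0.5
    # (inverse-CDF sampling of a Pareto distribution)
    return 0.5 * (1 - rng.random(n)) ** (-1.0 / (alpha - 1.0))

obs = simulate(2.35)                 # "observed" cluster masses (Salpeter-like slope)
s_obs = np.mean(np.log(obs))         # summary statistic of the observed sample

accepted = []
for alpha in rng.uniform(1.5, 3.5, 2000):          # flat prior on the slope
    if abs(np.mean(np.log(simulate(alpha))) - s_obs) < 0.05:
        accepted.append(alpha)                     # keep slopes matching the data
posterior_mean = float(np.mean(accepted))
```

The accepted slopes approximate draws from the posterior; the paper's contribution is precisely in replacing this naive simulator and summary with ones that encode cluster formation physics and observational effects.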