Catalogue Search | MBRL
Explore the vast range of titles available.
8,633 results for "Splines"
ASYMPTOTIC PROPERTIES OF PENALIZED SPLINE ESTIMATORS IN CONCAVE EXTENDED LINEAR MODELS
2021
This paper develops a general theory on rates of convergence of penalized spline estimators for function estimation when the likelihood functional is concave in candidate functions, where the likelihood is interpreted in a broad sense that includes conditional likelihood, quasi-likelihood and pseudo-likelihood. The theory allows all feasible combinations of the spline degree, the penalty order and the smoothness of the unknown functions. According to this theory, the asymptotic behavior of the penalized spline estimators depends on the interplay between the spline knot number and the penalty parameter. The general theory is applied to obtain results in a variety of contexts, including regression, generalized regression such as logistic regression and Poisson regression, density estimation, conditional hazard function estimation for censored data, quantile regression, diffusion function estimation for a diffusion type process and estimation of spectral density function of a stationary time series. For multidimensional function estimation, the theory (presented in the Supplementary Material) covers both penalized tensor product splines and penalized bivariate splines on triangulations.
Journal Article
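As a concrete illustration of the kind of estimator this theory covers, here is a minimal penalized spline regression in Python. It is a sketch only, not the paper's method or code: the truncated power basis, the quantile knot placement, and the ridge-style penalty weight `lam` are all assumptions made for the example.

```python
import numpy as np

def penalized_spline_fit(x, y, n_knots=10, degree=3, lam=1.0):
    """Penalized spline regression with a truncated power basis and a
    ridge penalty on the knot coefficients (illustrative sketch)."""
    # Interior knots at sample quantiles of x.
    knots = np.quantile(x, np.linspace(0, 1, n_knots + 2)[1:-1])
    # Design matrix: global polynomial part + truncated power functions.
    X = np.column_stack([x**d for d in range(degree + 1)] +
                        [np.clip(x - k, 0, None)**degree for k in knots])
    # Penalize only the knot coefficients, not the polynomial part.
    D = np.diag([0.0] * (degree + 1) + [1.0] * n_knots)
    beta = np.linalg.solve(X.T @ X + lam * D, X.T @ y)
    return X @ beta

rng = np.random.default_rng(0)
x = np.sort(rng.uniform(0, 1, 200))
y = np.sin(2 * np.pi * x) + rng.normal(0, 0.2, 200)
fit = penalized_spline_fit(x, y, lam=0.1)
```

The penalty parameter trades off fidelity against smoothness for a fixed knot set, which is exactly the knot-number/penalty-parameter interplay the theory above characterizes.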
Transverse Key vs Spline Shaft: Efficiency and Design Trade-Offs in Torque Transmission
2025
This research evaluates transverse keys against various spline connections for torque transfer efficiency. Fine splines demonstrate superior performance overall, while transverse keys offer competitive advantages for narrow hubs and cost-sensitive applications. The findings guide designers in selecting suitable connection types based on torque requirements, manufacturing complexity, and practical constraints.
Journal Article
A Penalized Framework for Distributed Lag Non-Linear Models
by Kenward, Michael G.; Scheipl, Fabian; Gasparrini, Antonio
in Applications programs; biometry; Computer simulation
2017
Distributed lag non-linear models (DLNMs) are a modelling tool for describing potentially non-linear and delayed dependencies. Here, we illustrate an extension of the DLNM framework through the use of penalized splines within generalized additive models (GAM). This extension offers built-in model selection procedures and the possibility of accommodating assumptions on the shape of the lag structure through specific penalties. In addition, this framework includes, as special cases, simpler models previously proposed for linear relationships (DLMs). Alternative versions of penalized DLNMs are compared with each other and with the standard unpenalized version in a simulation study. Results show that this penalized extension to the DLNM class provides greater flexibility and improved inferential properties. The framework exploits recent theoretical developments of GAMs and is implemented using efficient routines within freely available software. Real-data applications are illustrated through two reproducible examples in time series and survival analysis.
Journal Article
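The penalized smoothing of a lag structure described above can be sketched in a few lines. The toy distributed lag model below is not the authors' GAM-based framework: the lag length, the second-order difference penalty, and the weight `lam` are illustrative assumptions.

```python
import numpy as np

def penalized_dlm(x, y, max_lag=5, lam=1.0):
    """Distributed lag model with a second-order difference penalty on
    the lag coefficients, so the estimated lag curve is smooth."""
    n = len(x)
    # Column l holds x lagged by l time steps (rows with full history only).
    Q = np.column_stack([x[max_lag - l:n - l] for l in range(max_lag + 1)])
    # Second-order difference matrix acting on the lag coefficients.
    D = np.diff(np.eye(max_lag + 1), n=2, axis=0)
    theta = np.linalg.solve(Q.T @ Q + lam * D.T @ D, Q.T @ y[max_lag:])
    return theta

rng = np.random.default_rng(1)
n, L = 600, 5
x = rng.normal(size=n)
true = np.array([1.0, 0.8, 0.6, 0.4, 0.2, 0.0])   # smoothly decaying lag curve
y = np.zeros(n)
y[L:] = (np.column_stack([x[L - l:n - l] for l in range(L + 1)]) @ true
         + rng.normal(0, 0.1, n - L))
theta = penalized_dlm(x, y, max_lag=L, lam=1.0)
```

The difference penalty encodes the assumption that effects at neighboring lags are similar; this is the kind of shape constraint on the lag structure that the penalized DLNM framework formalizes.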
Acute Effects of Ambient Particulate Matter on Mortality in Europe and North America: Results from the APHENA Study
by Burnett, Rick; Katsouyanni, Klea; Peng, Roger
in Age groups; Air Pollutants - toxicity; Air pollution
2008
Background: The APHENA (Air Pollution and Health: A Combined European and North American Approach) study is a collaborative analysis of multicity time-series data on the effect of air pollution on population health, bringing together data from the European APHEA (Air Pollution and Health: A European Approach) and U.S. NMMAPS (National Morbidity, Mortality and Air Pollution Study) projects, along with Canadian data. Objectives: The main objective of APHENA was to assess the coherence of the findings of the multicity studies carried out in Europe and North America, when analyzed with a common protocol, and to explore sources of possible heterogeneity. We present APHENA results on the effects of particulate matter (PM) ≤ 10 μm in aerodynamic diameter (PM₁₀) on the daily number of deaths for all ages and for those < 75 and ≥ 75 years of age. We explored the impact of potential environmental and socioeconomic factors that may modify this association. Methods: In the first stage of a two-stage analysis, we used Poisson regression models, with natural and penalized splines, to adjust for seasonality, with various degrees of freedom. In the second stage, we used meta-regression approaches to combine time-series results across cities and to assess effect modification by selected ecologic covariates. Results: Air pollution risk estimates were relatively robust to different modeling approaches. Risk estimates from Europe and the United States were similar, but those from Canada were substantially higher. The combined effect of PM₁₀ on all-cause mortality across all ages for cities with daily air pollution data ranged from 0.2% to 0.6% for a 10-μg/m³ increase in ambient PM₁₀ concentration. Effect modification by other pollutants and climatic variables differed in Europe and the United States. In both of these regions, a higher proportion of older people and higher unemployment were associated with increased air pollution risk.
Conclusions: Estimates of the increased mortality associated with PM air pollution based on the APHENA study were generally comparable with results of previous reports. Overall, risk estimates were similar in Europe and in the United States but higher in Canada. However, PM₁₀ effect modification patterns were somewhat different in Europe and the United States.
Journal Article
ADAPTIVE PIECEWISE POLYNOMIAL ESTIMATION VIA TREND FILTERING
2014
We study trend filtering, a recently proposed tool of Kim et al. [SIAM Rev. 51 (2009) 339-360] for nonparametric regression. The trend filtering estimate is defined as the minimizer of a penalized least squares criterion, in which the penalty term sums the absolute kth order discrete derivatives over the input points. Perhaps not surprisingly, trend filtering estimates appear to have the structure of kth degree spline functions, with adaptively chosen knot points (we say "appear" here as trend filtering estimates are not really functions over continuous domains, and are only defined over the discrete set of inputs). This brings to mind comparisons to other nonparametric regression tools that also produce adaptive splines; in particular, we compare trend filtering to smoothing splines, which penalize the sum of squared derivatives across input points, and to locally adaptive regression splines [Ann. Statist. 25 (1997) 387-413], which penalize the total variation of the kth derivative. Empirically, we discover that trend filtering estimates adapt to the local level of smoothness much better than smoothing splines, and further, they exhibit a remarkable similarity to locally adaptive regression splines. We also provide theoretical support for these empirical findings; most notably, we prove that (with the right choice of tuning parameter) the trend filtering estimate converges to the true underlying function at the minimax rate for functions whose kth derivative is of bounded variation. This is done via an asymptotic pairing of trend filtering and locally adaptive regression splines, which have already been shown to converge at the minimax rate [Ann. Statist. 25 (1997) 387-413]. At the core of this argument is a new result tying together the fitted values of two lasso problems that share the same outcome vector, but have different predictor matrices.
Journal Article
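The trend filtering regularizer described in the abstract is easy to write down. The sketch below evaluates only the penalty, not a solver, and the convention that degree-k fits correspond to penalizing discrete differences of order k + 1 is an assumption of the sketch; the test signals are illustrative.

```python
import numpy as np

def tf_penalty(theta, k):
    """Trend filtering regularizer: sum of absolute discrete differences
    of order k + 1, which induces degree-k piecewise polynomial fits."""
    return np.sum(np.abs(np.diff(theta, n=k + 1)))

# A piecewise linear signal has sparse second differences: nonzero only
# at its single knot, so its k = 1 penalty is far smaller than that of
# a smooth but wiggly signal of comparable amplitude.
t = np.arange(11, dtype=float)
pw_linear = np.where(t <= 5, t, 10 - t)   # kink at t = 5
wiggly = np.sin(t)
```

Minimizing squared error plus lam * tf_penalty(theta, k) over theta is the penalized trend filtering problem; the l1 structure of the penalty is what produces the adaptively chosen knot points the abstract emphasizes.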
ADDITIVE MODELS WITH TREND FILTERING
by Tibshirani, Ryan J.; Sadhanala, Veeranjaneyulu
in Algorithms; Discrete element method; Estimating techniques
2019
We study additive models built with trend filtering, that is, additive models whose components are each regularized by the (discrete) total variation of their kth (discrete) derivative, for a chosen integer k ≥ 0. This results in kth degree piecewise polynomial components (e.g., k = 0 gives piecewise constant components, k = 1 gives piecewise linear, k = 2 gives piecewise quadratic, etc.). Analogous to its advantages in the univariate case, additive trend filtering has favorable theoretical and computational properties, thanks in large part to the localized nature of the (discrete) total variation regularizer that it uses. On the theory side, we derive fast error rates for additive trend filtering estimates, and show these rates are minimax optimal when the underlying function is additive and has component functions whose derivatives are of bounded variation. We also show that these rates are unattainable by additive smoothing splines (and by additive models built from linear smoothers, in general). On the computational side, we use backfitting to leverage fast univariate trend filtering solvers; we also describe a new backfitting algorithm whose iterations can be run in parallel, which (as far as we can tell) is the first of its kind. Lastly, we present a number of experiments to examine the empirical performance of trend filtering.
Journal Article
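The backfitting procedure mentioned above can be sketched generically: cycle through the components, refitting each one to the partial residual of the others. The sketch swaps in a toy quantile-bin smoother in place of the univariate trend filtering solvers the paper actually leverages; the bin count and iteration count are illustrative assumptions.

```python
import numpy as np

def bin_smoother(x, r, n_bins=10):
    """Toy univariate smoother: average the response within quantile bins."""
    edges = np.quantile(x, np.linspace(0, 1, n_bins + 1))
    idx = np.clip(np.searchsorted(edges, x, side="right") - 1, 0, n_bins - 1)
    means = np.array([r[idx == b].mean() for b in range(n_bins)])
    return means[idx]

def backfit(X, y, smoother, n_iter=20):
    """Backfitting for an additive model y ≈ intercept + sum_j f_j(X[:, j])."""
    n, p = X.shape
    f = np.zeros((n, p))
    for _ in range(n_iter):
        for j in range(p):
            # Partial residual: remove every fitted component except the jth.
            r = y - y.mean() - f.sum(axis=1) + f[:, j]
            f[:, j] = smoother(X[:, j], r)
            f[:, j] -= f[:, j].mean()   # center each component for identifiability
    return f, y.mean()

rng = np.random.default_rng(2)
X = rng.uniform(-1, 1, (400, 2))
y = X[:, 0]**2 + np.sin(np.pi * X[:, 1]) + rng.normal(0, 0.1, 400)
f, intercept = backfit(X, y, bin_smoother)
resid = y - intercept - f.sum(axis=1)
```

The serial loop above is the classical form of backfitting; the paper additionally develops a parallelizable variant of the iterations.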
ADAPTIVE RISK BOUNDS IN UNIVARIATE TOTAL VARIATION DENOISING AND TREND FILTERING
by Lieu, Donovan; Chatterjee, Sabyasachi; Guntuboyina, Adityanand
in Adaptive filters; Asymptotic methods; Discrete element method
2020
We study trend filtering, a relatively recent method for univariate non-parametric regression. For a given integer r ≥ 1, the rth order trend filtering estimator is defined as the minimizer of the sum of squared errors when we constrain (or penalize) the sum of the absolute rth order discrete derivatives of the fitted function at the design points. For r = 1, the estimator reduces to total variation regularization which has received much attention in the statistics and image processing literature. In this paper, we study the performance of the trend filtering estimator for every r ≥ 1, both in the constrained and penalized forms. Our main results show that in the strong sparsity setting when the underlying function is a (discrete) spline with few "knots," the risk (under the global squared error loss) of the trend filtering estimator (with an appropriate choice of the tuning parameter) achieves the parametric n⁻¹ rate, up to a logarithmic (multiplicative) factor. Our results therefore provide support for the use of trend filtering, for every r ≥ 1, in the strong sparsity setting.
Journal Article
The Estimating Parameter and Number of Knots for Nonparametric Regression Methods in Modelling Time Series Data
2024
This research aims to explore and compare several nonparametric regression techniques, including smoothing splines, natural cubic splines, B-splines, and penalized spline methods. The focus is on estimating parameters and determining the optimal number of knots to forecast cyclic and nonlinear patterns, applying these methods to simulated and real-world datasets, such as Thailand’s coal import data. Cross-validation techniques are used to control and specify the number of knots, ensuring the curve fits the data points accurately. The study applies nonparametric regression to forecast time series data with cyclic patterns and nonlinear forms in the dependent variable, treating the independent variable as sequential data. Simulated data featuring cyclical patterns resembling economic cycles and nonlinear data with complex equations to capture variable interactions are used for experimentation. These simulations include variations in standard deviations and sample sizes. The evaluation criterion for the simulated data is the minimum average mean square error (MSE), which indicates the most efficient parameter estimation. For the real data, monthly coal import data from Thailand is used to estimate the parameters of the nonparametric regression model, with the MSE as the evaluation metric. The performance of these techniques is also assessed in forecasting future values, where the mean absolute percentage error (MAPE) is calculated. Among the methods, the natural cubic spline consistently yields the lowest average mean square error across all standard deviations and sample sizes in the simulated data. While the natural cubic spline excels in parameter estimation, B-splines show strong performance in forecasting future values.
Journal Article
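Knot selection by cross-validation, as the study above employs, can be sketched with an unpenalized regression spline. This is not the study's code: the truncated power basis, quantile knot placement, five folds, and the candidate grid are all assumptions of the example.

```python
import numpy as np

def spline_design(x, knots, degree=3):
    """Truncated power basis design matrix for a regression spline."""
    return np.column_stack([x**d for d in range(degree + 1)] +
                           [np.clip(x - k, 0, None)**degree for k in knots])

def cv_mse(x, y, n_knots, n_folds=5):
    """k-fold cross-validated MSE for a spline with n_knots interior knots."""
    fold = np.random.default_rng(0).integers(0, n_folds, len(x))
    knots = np.quantile(x, np.linspace(0, 1, n_knots + 2)[1:-1])
    errs = []
    for f in range(n_folds):
        tr, te = fold != f, fold == f
        beta, *_ = np.linalg.lstsq(spline_design(x[tr], knots), y[tr], rcond=None)
        errs.append(np.mean((y[te] - spline_design(x[te], knots) @ beta)**2))
    return np.mean(errs)

rng = np.random.default_rng(3)
x = np.sort(rng.uniform(0, 1, 300))
y = np.sin(6 * np.pi * x) + rng.normal(0, 0.3, 300)   # three full cycles + noise
scores = {k: cv_mse(x, y, k) for k in (2, 5, 10, 20)}
best = min(scores, key=scores.get)
```

Too few knots underfit the cyclic pattern and inflate the CV error; past the sweet spot the error roughly flattens, mirroring the knot-selection trade-off the abstract describes.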
Improved Dynamic Predictions from Joint Models of Longitudinal and Survival Data with Time-Varying Effects Using P-Splines
by Rizopoulos, Dimitris; Andrinopoulou, Eleni-Rosalina; Eilers, Paul H. C.
in Aorta; Aortic Valve - diagnostic imaging; Aortic Valve - transplantation
2018
In the field of cardio-thoracic surgery, valve function is monitored over time after surgery. The motivation for our research comes from a study which includes patients who received a human tissue valve in the aortic position. These patients are followed prospectively over time by standardized echocardiographic assessment of valve function. Loss of follow-up could be caused by valve intervention or the death of the patient. One of the main characteristics of the human valve is that its durability is limited. Therefore, it is of interest to obtain a prognostic model so that physicians can monitor trends in valve function over time and plan the next intervention, accounting for the characteristics of the data. Several authors have focused on deriving predictions under the standard joint modeling of longitudinal and survival data framework that assumes a constant effect for the coefficient that links the longitudinal and survival outcomes. However, in our case, this may be a restrictive assumption. Since the valve degenerates, the association between the biomarker and survival may change over time. To improve dynamic predictions, we propose a Bayesian joint model that allows a time-varying coefficient to link the longitudinal and the survival processes, using P-splines. We evaluate the performance of the model in terms of discrimination and calibration, while accounting for censoring.
Journal Article