Filters:
  • Discipline
  • Is Peer Reviewed
  • Series Title
  • Reading Level
  • Year (From - To)
  • More Filters: Content Type, Item Type, Is Full-Text Available, Subject, Publisher, Source, Donor, Language, Place of Publication, Contributors, Location
195,205 results for "rate models"
Multifactorial Heath-Jarrow-Morton model using principal component analysis
In this study, we propose an implementation of the multifactor Heath-Jarrow-Morton (HJM) interest rate model using an approach that integrates principal component analysis (PCA) and Monte Carlo simulation (MCS). By integrating PCA and MCS with the multifactor HJM model, we capture the principal factors driving the evolution of short-term interest rates in the US market. We also provide a framework for deriving spot interest rates through parameter calibration and forward rate estimation, using daily data on the US yield curve from June 2017 to December 2019. Compared to previous approaches, combining PCA and MCS with the multifactor HJM model offers a robust and precise characterization of interest rate dynamics, with greater accuracy and an improved understanding of the factors that influence US Treasury yields.
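The abstract describes extracting factor volatilities with PCA and feeding them into a Monte Carlo simulation of HJM forward-rate dynamics. The sketch below illustrates that pipeline on synthetic data only; the synthetic curve, the three-factor choice and all numbers are assumptions, not the paper's 2017-2019 US yield-curve calibration, and the Musiela convexity term is omitted for brevity.

```python
import numpy as np

# Illustrative PCA + Monte Carlo HJM sketch on synthetic data (assumed values).
rng = np.random.default_rng(0)
taus = np.linspace(0.25, 10.0, 40)                      # maturity grid (years)
hist_fwd = (0.02 + 0.005 * np.sin(taus)
            + 0.001 * rng.standard_normal((500, 40)).cumsum(axis=0))

# 1. PCA on daily forward-rate changes to extract factor volatility loadings.
dF = np.diff(hist_fwd, axis=0)
cov = np.cov(dF - dF.mean(axis=0), rowvar=False) * 252  # annualised covariance
eigval, eigvec = np.linalg.eigh(cov)
top = np.argsort(eigval)[::-1][:3]                      # keep the 3 largest factors
sigma = eigvec[:, top] * np.sqrt(eigval[top])           # (maturities x factors)

# 2. Monte Carlo under the HJM no-arbitrage drift
#    mu(t,T) = sum_i sigma_i(t,T) * int_t^T sigma_i(t,u) du,
#    approximated here with a cumulative sum over the maturity grid
#    (the Musiela term df/dtau is omitted to keep the sketch short).
dt, dtau = 1.0 / 252, taus[1] - taus[0]
drift = np.sum(sigma * np.cumsum(sigma, axis=0) * dtau, axis=1)
n_steps, n_paths = 252, 1000
f = np.tile(hist_fwd[-1], (n_paths, 1))                 # start from the last observed curve
for _ in range(n_steps):
    z = rng.standard_normal((n_paths, sigma.shape[1]))
    f = f + drift * dt + (z @ sigma.T) * np.sqrt(dt)

print("explained variance of 3 factors: %.1f%%" % (100 * eigval[top].sum() / eigval.sum()))
print("short-rate proxy after 1y: mean %.4f, sd %.4f" % (f[:, 0].mean(), f[:, 0].std()))
```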
How well do contemporary theories explain floating exchange rate changes in an emerging economy: The case of EUR/PLN
The purpose of this paper is to investigate how well contemporary exchange rate theories explain fluctuations in the exchange rates of emerging economies before and after the Global Financial Crisis (GFC). The EUR/PLN exchange rate in 1999-2015 was selected as an example because it was the most liquid currency pair in the region and operated under a stable exchange rate regime over the period. The analysis was performed within a linear vector error correction (VEC) model framework. The VEC models incorporate such well-known theories as purchasing power parity (PPP), uncovered interest rate parity (UIP), the Harrod-Balassa-Samuelson (HBS) effect, the terms of trade (TOT), the net financial asset (NFA) theory and the risk premium. The results indicate a greater importance of external factors after the GFC, in particular Euro Area (EA) short-term interest rates and EA price shocks. The main sources of EUR/PLN variability were found to be the exchange rate's own shocks, terms of trade shocks, foreign and domestic short-term interest rate shocks, and foreign price shocks. The prominence of the exchange rate's own shocks indicates that a large part of EUR/PLN fluctuations still remains unexplained.
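As context for the VEC framework the abstract relies on, here is a minimal sketch of selecting a cointegration rank and estimating a VECM with statsmodels. The three simulated series and their names (log_eurpln, rate_diff, log_tot), the lag order and the deterministic term are placeholders, not the paper's PPP/UIP/HBS/TOT/NFA specification.

```python
import numpy as np
import pandas as pd
from statsmodels.tsa.vector_ar.vecm import VECM, select_coint_rank

# Three fabricated cointegrated series sharing one common stochastic trend.
rng = np.random.default_rng(1)
n = 200
trend = rng.standard_normal(n).cumsum()
data = pd.DataFrame({
    "log_eurpln": trend + 0.3 * rng.standard_normal(n),
    "rate_diff": 0.5 * trend + 0.3 * rng.standard_normal(n),
    "log_tot": -0.2 * trend + 0.3 * rng.standard_normal(n),
})

# Johansen trace test for the cointegration rank.
rank = select_coint_rank(data, det_order=0, k_ar_diff=2, method="trace", signif=0.05)
print("selected cointegration rank:", rank.rank)

# Estimate the VECM; beta holds the long-run relations, alpha the adjustment speeds.
res = VECM(data, k_ar_diff=2, coint_rank=max(rank.rank, 1), deterministic="co").fit()
print("beta (cointegrating vectors):\n", res.beta)
print("alpha (loading matrix):\n", res.alpha)
```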
Review of Hysteresis Models for Magnetic Materials
There are several models for magnetic hysteresis. Their key purpose is to model magnetization curves with a history dependence, so that hysteresis cycles are obtained without a frequency dependence. The approaches to handling history dependence fall into two main categories: Duhem-type models and Preisach-type models. Duhem models handle it via a simple directional dependence on the flux rate, without a proper memory, whereas Preisach-type models handle it via a memory of the points where the direction of the flux rate changes. The most common Duhem model is the phenomenological Jiles–Atherton model; other examples include the Coleman–Hodgdon model and the Tellinen model. Examples of Preisach-type models are the classical Preisach model and the Prandtl–Ishlinskii model, and many other models adopt a similar history dependence. Hysteresis is by definition rate-independent, and thereby not dependent on the speed of the alternating flux density. An additional rate dependence is nevertheless important and is often included in dynamic hysteresis models. The Chua model is common for modeling nonlinear dynamic magnetization curves; however, it does not describe classical hysteresis. Other adaptations combine hysteresis modeling with eddy current modeling, similar to how frequency dependence is included in core loss modeling. Most models are formulated for scalar alternating fields, but several models have vector generalizations that consider three-dimensional directions.
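To make the Duhem-type directional dependence concrete, the sketch below integrates a commonly used explicit form of the Jiles–Atherton ODE along a sinusoidal applied field. The parameter values are arbitrary placeholders rather than fitted material constants, and the plain Euler stepping and the omitted non-physical-susceptibility guard are simplifications.

```python
import math

# Explicit Jiles-Atherton ODE, Euler-integrated along a sinusoidal field H(t).
Ms, a, alpha, k, c = 1.6e6, 1100.0, 1.6e-3, 3000.0, 0.2   # illustrative placeholders

def langevin(x):
    """L(x) = coth(x) - 1/x, with the series value x/3 near zero."""
    return x / 3.0 if abs(x) < 1e-4 else 1.0 / math.tanh(x) - 1.0 / x

def dlangevin(x):
    """L'(x) = 1 - coth(x)**2 + 1/x**2, with L'(0) = 1/3."""
    return 1.0 / 3.0 if abs(x) < 1e-4 else 1.0 - 1.0 / math.tanh(x) ** 2 + 1.0 / x ** 2

n = 4000
H = [5000.0 * math.sin(4.0 * math.pi * i / n) for i in range(n)]   # two field periods
M = [0.0] * n
for i in range(1, n):
    dH = H[i] - H[i - 1]
    delta = 1.0 if dH >= 0.0 else -1.0           # sign of dH/dt: the Duhem-type ingredient
    He = H[i - 1] + alpha * M[i - 1]             # effective field
    Man = Ms * langevin(He / a)                  # anhysteretic magnetisation
    dMan = (Ms / a) * dlangevin(He / a)          # dMan/dHe
    dMirr = (Man - M[i - 1]) / (delta * k - alpha * (Man - M[i - 1]))
    dM_dH = ((1.0 - c) * dMirr + c * dMan) / (1.0 - c * alpha * dMan)
    M[i] = M[i - 1] + dM_dH * dH                 # the (H, M) pairs trace a rate-independent loop

print("peak magnetisation (A/m):", round(max(M)))
```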
On some stochastic comparisons of arithmetic and geometric mixture models
Most studies on reliability analysis have been conducted in homogeneous populations. However, homogeneous populations can rarely be found in the real world; populations are usually heterogeneous with respect to characteristics such as lifetime. This heterogeneity raises the question of which modeling strategies are appropriate and which of them should be preferred. In this paper, we consider mixture models, which are effective tools for modeling heterogeneity in populations. Specifically, we carry out a stochastic comparison of two arithmetic (finite) mixture models using the majorization concept, in the sense of the usual stochastic order, the hazard rate order, the reversed hazard rate order and the dispersive order, both for a general case and for some semiparametric families of distributions. Moreover, we obtain sufficient conditions to compare two geometric mixture models. To illustrate the theoretical findings, some relevant examples and counterexamples are presented.
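A small numerical illustration of the objects being compared may help: two exponential components are combined into arithmetic and geometric mixtures, and the usual stochastic order is checked on a grid when one mixing vector majorizes the other. The component rates, weight vectors and grid are arbitrary choices, not taken from the paper.

```python
import numpy as np

# Arithmetic vs geometric mixtures of two exponential survival functions.
t = np.linspace(0.01, 5.0, 500)
S1, S2 = np.exp(-0.5 * t), np.exp(-2.0 * t)      # component survival functions

def arithmetic_mix(p):
    return p[0] * S1 + p[1] * S2                 # S(t) = sum_i p_i * S_i(t)

def geometric_mix(p):
    return S1 ** p[0] * S2 ** p[1]               # S(t) = prod_i S_i(t)**p_i

p_even, p_skew = (0.5, 0.5), (0.9, 0.1)          # (0.9, 0.1) majorizes (0.5, 0.5)

# Usual stochastic order check: the mixture whose weights put more mass on the
# longer-lived component dominates pointwise on this grid.
print("arithmetic:", bool(np.all(arithmetic_mix(p_skew) >= arithmetic_mix(p_even))))
print("geometric: ", bool(np.all(geometric_mix(p_skew) >= geometric_mix(p_even))))
```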
Two‐Component Mixture Cure Rate Model with Spline Estimated Nonparametric Components
In the survival analysis of some medical studies, there are often long-term survivors who can be considered permanently cured. The goals in these studies are to estimate the noncured probability for the whole population and the hazard rate of the susceptible subpopulation. When covariates are present, as often happens in practice, understanding the covariate effects on the noncured probability and the hazard rate is of equal importance. Existing methods are limited to parametric and semiparametric models. We propose a two-component mixture cure rate model with nonparametric forms for both the cure probability and the hazard rate function. Identifiability of the model is guaranteed by an additive assumption that allows no time-covariate interactions in the logarithm of the hazard rate. Estimation is carried out by an expectation-maximization algorithm that maximizes a penalized likelihood. For inference, we apply the Louis formula to obtain pointwise confidence intervals for the noncured probability and the hazard rate. Asymptotic convergence rates of our function estimates are established. We evaluate the proposed method by extensive simulations, and we analyze the survival data from a melanoma study, where we find interesting patterns.
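The two-component structure the abstract describes, S(t|x) = 1 - pi(x) + pi(x) * Su(t|x), can be illustrated with a deliberately simplified stand-in: a fully parametric cure model (logistic incidence, Weibull latency) fitted by direct maximum likelihood on simulated right-censored data. This is not the authors' penalized-spline EM estimator; the simulation design and parameter values are assumptions.

```python
import numpy as np
from scipy.optimize import minimize

# Simulated data: logistic incidence pi(x), Weibull latency, uniform censoring.
rng = np.random.default_rng(2)
n = 2000
x = rng.standard_normal(n)
pi_true = 1.0 / (1.0 + np.exp(-(0.5 + 1.0 * x)))         # P(susceptible | x), assumed
susceptible = rng.random(n) < pi_true
t_event = np.where(susceptible, 2.0 * rng.weibull(1.5, n), np.inf)
censor = rng.uniform(0.5, 6.0, n)
time = np.minimum(t_event, censor)
event = (t_event <= censor).astype(float)

def negloglik(theta):
    """-log L of S_pop(t|x) = 1 - pi(x) + pi(x) * Su(t|x) under right censoring."""
    b0, b1, log_k, log_lam = theta
    pi = 1.0 / (1.0 + np.exp(-(b0 + b1 * x)))
    k, lam = np.exp(log_k), np.exp(log_lam)
    Su = np.exp(-(time / lam) ** k)                      # Weibull survival (susceptibles)
    fu = (k / lam) * (time / lam) ** (k - 1.0) * Su      # Weibull density
    lik = np.where(event == 1.0, pi * fu, 1.0 - pi + pi * Su)
    return -np.sum(np.log(lik + 1e-300))

fit = minimize(negloglik, x0=np.zeros(4), method="Nelder-Mead", options={"maxiter": 5000})
b0, b1, k, lam = fit.x[0], fit.x[1], np.exp(fit.x[2]), np.exp(fit.x[3])
print(f"incidence coefficients: ({b0:.2f}, {b1:.2f}); Weibull shape {k:.2f}, scale {lam:.2f}")
```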
Nadarajah-Haghighi Model for Survival Data With Long Term Survivors in the Presence of Right Censored Data
In this paper, a new long-term survival model, the Nadarajah-Haghighi model for survival data with long-term survivors, is proposed. The model is used to fit data where the population of interest is a mixture of individuals who are susceptible to the event of interest and individuals who are not. The statistical properties of the proposed model, including the quantile function, moments, mean and variance, are provided. A maximum likelihood estimation procedure is used to estimate the parameters of the model under right censoring, and a Bayesian method of estimation is also employed in the same setting. A simulation study was performed to assess the performance of the maximum likelihood estimators: random samples of different sizes were generated from the model with arbitrary parameter values for cure fraction values of 5%, 1:3% and 1:5%, and bias, standard error and mean squared error were used as discrimination criteria. Additionally, we compared the performance of the proposed model with some competing models; the results of the applications indicate that the proposed model is more efficient than the models it was compared with. Finally, we fitted models that include type of treatment as a covariate and observed that the covariate has an effect on the shape parameter of the proposed model.
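For readers unfamiliar with the Nadarajah-Haghighi (NH) baseline, the sketch below writes its survival and quantile functions and the right-censored log-likelihood of a simple mixture cure version, and simulates data by inverse transform. The mixture form, parameter values and censoring scheme are illustrative assumptions; the paper's exact parameterization and its Bayesian fitting are not reproduced here.

```python
import numpy as np

# Standard NH survival S(t) = exp(1 - (1 + lam*t)**alpha); the cure model below
# uses the simple mixture S_pop(t) = p + (1 - p) * S(t), an assumed form.
def nh_surv(t, alpha, lam):
    return np.exp(1.0 - (1.0 + lam * t) ** alpha)

def nh_quantile(u, alpha, lam):
    """Solves F(t) = 1 - S(t) = u for t (inverse-transform sampling)."""
    return ((1.0 - np.log(1.0 - u)) ** (1.0 / alpha) - 1.0) / lam

def cure_loglik(times, events, p, alpha, lam):
    """Right-censored log-likelihood of the mixture cure model."""
    S = nh_surv(times, alpha, lam)
    f = alpha * lam * (1.0 + lam * times) ** (alpha - 1.0) * S      # NH density
    contrib = np.where(events == 1, (1.0 - p) * f, p + (1.0 - p) * S)
    return np.sum(np.log(contrib))

# Simulate: a fraction p is cured (never fails), the rest follow NH lifetimes,
# and everyone is subject to independent exponential censoring.
rng = np.random.default_rng(3)
alpha, lam, p = 0.8, 0.5, 0.2
n = 1000
susceptible = rng.random(n) >= p
t_latent = np.where(susceptible, nh_quantile(rng.random(n), alpha, lam), np.inf)
cens = rng.exponential(3.0, n)
times = np.minimum(t_latent, cens)
events = (t_latent <= cens).astype(int)
print("log-likelihood at the generating parameters:",
      cure_loglik(times, events, p, alpha, lam))
```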
Geoadditive Survival Models
Survival data often contain small-area geographical or spatial information, such as the residence of individuals. In many cases, the impact of such spatial effects on hazard rates is of considerable substantive interest. Therefore, extensions of known survival or hazard rate models to spatial models have been suggested. Mostly, a spatial component is added to the usual linear predictor of the Cox model. In this article flexible continuous-time geoadditive models are proposed, extending the Cox model with respect to several aspects often needed in applications. The common linear predictor is generalized to an additive predictor, including nonparametric components for the log-baseline hazard, time-varying effects, and possibly nonlinear effects of continuous covariates or further time scales, and a spatial component for geographical effects. In addition, uncorrelated frailty effects or nonlinear two-way interactions can be incorporated. Inference is developed within a unified fully Bayesian framework. Penalized regression splines and Markov random fields are suggested as basic building blocks, and geostatistical (kriging) models are also considered. Posterior analysis uses computationally efficient Markov chain Monte Carlo sampling schemes. Smoothing parameters are an integral part of the model and are estimated automatically. Propriety of posteriors is shown under fairly general conditions, and practical performance is investigated through simulation studies. Our approach is applied to data from a case study in London and Essex that aims to estimate the effect of area of residence and further covariates on waiting times to coronary artery bypass grafting. Results provide clear evidence of nonlinear time-varying effects, and considerable spatial variability of waiting times to bypass grafting.
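The abstract names penalized regression splines and Markov random fields as the basic building blocks of the geoadditive predictor. The sketch below constructs the corresponding penalty (prior precision) matrices for a toy setting: a second-difference P-spline penalty and an intrinsic CAR precision from a made-up adjacency of five regions. The basis size, the adjacency and the interpretive comments are illustrative assumptions, not the paper's London and Essex application.

```python
import numpy as np

# Building blocks of a geoadditive predictor: a P-spline penalty and an
# MRF (intrinsic CAR) penalty, both used as precision matrices of Gaussian priors.
n_basis = 20
D = np.diff(np.eye(n_basis), n=2, axis=0)       # second-difference operator
K_spline = D.T @ D                              # P-spline penalty/precision matrix

# Toy adjacency for 5 regions on a line: neighbours share a border.
A = np.zeros((5, 5))
for i in range(4):
    A[i, i + 1] = A[i + 1, i] = 1
K_mrf = np.diag(A.sum(axis=1)) - A              # intrinsic CAR (MRF) precision

# Under the MRF prior, each region effect is shrunk toward the mean of its
# neighbours: b_s | b_-s ~ N(mean of neighbouring effects, tau^2 / n_neighbours).
print("spline penalty rank:", np.linalg.matrix_rank(K_spline))   # n_basis - 2
print("MRF penalty rank:   ", np.linalg.matrix_rank(K_mrf))      # n_regions - 1
```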