Catalogue Search | MBRL
164 result(s) for "Scherer, Matthias"
Simulating copulas : stochastic models, sampling algorithms, and applications
This book provides the reader with background on simulating copulas and multivariate distributions in general. It unifies the scattered literature on the simulation of various families of copulas and on different construction principles.
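One standard construction principle covered in this strand of the literature is the frailty (Marshall-Olkin) representation of Archimedean copulas. A minimal sketch for the Clayton family — parameter values are illustrative, and this is not code from the book:

```python
import math
import random

def sample_clayton(theta, dim, n, seed=0):
    """Sample n points from a dim-dimensional Clayton copula via the
    Marshall-Olkin frailty algorithm: draw V ~ Gamma(1/theta, 1) and
    E_i ~ Exp(1) i.i.d., then set U_i = (1 + E_i / V)^(-1/theta)."""
    rng = random.Random(seed)
    out = []
    for _ in range(n):
        v = rng.gammavariate(1.0 / theta, 1.0)
        out.append(tuple((1.0 + rng.expovariate(1.0) / v) ** (-1.0 / theta)
                         for _ in range(dim)))
    return out

# theta = 2 corresponds to Kendall's tau = theta / (theta + 2) = 0.5
samples = sample_clayton(theta=2.0, dim=2, n=10_000)
```

The same frailty template yields other Archimedean families by swapping the mixing distribution of V.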
Modeling Recovery Rates of Small- and Medium-Sized Entities in the US
by Schischke, Amelie; Zagst, Rudi; Min, Aleksey
in decision tree, loss given default, mixture model
2020
A sound statistical model for recovery rates is required for various applications in quantitative risk management, with the computation of capital requirements for loan portfolios as one important example. We compare different models for predicting the recovery rate at the borrower level, including linear and quantile regressions, decision trees, neural networks, and mixture regression models. We fit and apply these models to the world's largest loss and recovery data set for commercial loans, provided by GCD, focusing on small- and medium-sized entities in the US. Additionally, we include macroeconomic information via a predictive Crisis Indicator or Crisis Probability indicating whether economic downturn scenarios are expected within the time of resolution. The horse race is won by the mixture regression model, which regresses both the component densities and the probabilities that an observation belongs to a given component.
Journal Article
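The mixture-regression idea — fitting component densities and membership probabilities jointly — can be sketched with a small EM routine on synthetic data. The two "regimes", coefficients, and initialisation below are invented for illustration and are not the authors' model:

```python
import math
import random

def normal_pdf(y, mu, sigma):
    return math.exp(-0.5 * ((y - mu) / sigma) ** 2) / (sigma * math.sqrt(2.0 * math.pi))

def fit_mixture_regression(x, y, init, n_iter=200):
    """EM for a two-component mixture of linear regressions:
    y | x, component k ~ Normal(a_k + b_k * x, s_k), mixture weights w_k."""
    n = len(x)
    params = [dict(p) for p in init]
    for _ in range(n_iter):
        # E-step: responsibility of each component for each observation
        resp = []
        for xi, yi in zip(x, y):
            like = [p["w"] * normal_pdf(yi, p["a"] + p["b"] * xi, p["s"]) for p in params]
            tot = sum(like) or 1e-300
            resp.append([l / tot for l in like])
        # M-step: weighted least squares per component
        for k, p in enumerate(params):
            w = [r[k] for r in resp]
            sw = sum(w)
            mx = sum(wi * xi for wi, xi in zip(w, x)) / sw
            my = sum(wi * yi for wi, yi in zip(w, y)) / sw
            sxx = sum(wi * (xi - mx) ** 2 for wi, xi in zip(w, x)) or 1e-12
            p["b"] = sum(wi * (xi - mx) * (yi - my) for wi, xi, yi in zip(w, x, y)) / sxx
            p["a"] = my - p["b"] * mx
            sse = sum(wi * (yi - p["a"] - p["b"] * xi) ** 2 for wi, xi, yi in zip(w, x, y))
            p["s"] = max(math.sqrt(sse / sw), 1e-6)
            p["w"] = sw / n
    return params

# synthetic data: two regimes standing in for "normal" vs "downturn" recoveries
rng = random.Random(1)
x, y = [], []
for _ in range(400):
    xi = rng.uniform(0.0, 10.0)
    if rng.random() < 0.5:
        y.append(1.0 + 0.8 * xi + rng.gauss(0.0, 0.3))   # component 1
    else:
        y.append(8.0 - 0.5 * xi + rng.gauss(0.0, 0.3))   # component 2
    x.append(xi)

init = [dict(a=1.0, b=1.0, s=1.0, w=0.5), dict(a=8.0, b=-1.0, s=1.0, w=0.5)]
params = fit_mixture_regression(x, y, init)
slopes = sorted(p["b"] for p in params)
```

The paper additionally lets covariates drive the mixture weights; here the weights are scalar for brevity.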
A comprehensive model for cyber risk based on marked point processes and its application to insurance
by Zeller, Gabriela; Scherer, Matthias
in Accumulation risk, Actuarial science, Applications of Mathematics
2022
After scrutinizing technical, legal, financial, and actuarial aspects of cyber risk, a new approach for modelling cyber risk using marked point processes is proposed. Key covariates, required to model frequency and severity of cyber claims, are identified. The presented framework explicitly takes into account incidents from malicious untargeted and targeted attacks as well as accidents and failures. The resulting model is able to include the dynamic nature of cyber risk, while capturing accumulation risk in a realistic way. The model is studied with respect to its statistical properties and applied to the pricing of cyber insurance and risk measurement. The results are illustrated in a simulation study.
Journal Article
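A marked point process of the kind referenced above can be sketched as a Poisson arrival stream whose marks carry incident type and loss severity. The rate, type weights, and severity parameters below are illustrative assumptions, not the calibrated model of the paper:

```python
import math
import random

def simulate_marked_poisson(lam, T, seed=0):
    """Simulate cyber incidents as a marked Poisson process on [0, T]:
    arrivals with rate lam, each event carrying a mark (incident type,
    lognormal loss severity).  All numeric choices are illustrative."""
    rng = random.Random(seed)
    t, events = 0.0, []
    while True:
        t += rng.expovariate(lam)              # exponential inter-arrival times
        if t > T:
            break
        kind = rng.choices(["untargeted", "targeted", "accident"],
                           weights=[0.6, 0.1, 0.3])[0]
        # heavier severity for targeted attacks (an assumption, for illustration)
        mu = {"untargeted": 8.0, "targeted": 10.0, "accident": 7.0}[kind]
        events.append((t, kind, rng.lognormvariate(mu, 1.5)))
    return events

events = simulate_marked_poisson(lam=5.0, T=10.0)
total_loss = sum(loss for _, _, loss in events)
```

The paper's framework adds covariate-dependent intensities and accumulation features on top of this basic structure.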
Risk mitigation services in cyber insurance: optimal contract design and price structure
2023
As the cyber insurance market is expanding and cyber insurance policies continue to mature, the potential of including pre-incident and post-incident services in cyber policies is being recognised by insurers and insurance buyers. This work addresses the question of how such services should be priced from the insurer's viewpoint, i.e. under which conditions it is rational for a profit-maximising, risk-neutral or risk-averse insurer to share the costs of providing risk mitigation services. The interaction between insurance buyer and seller is modelled as a Stackelberg game, where both parties use distortion risk measures to model their individual risk aversion. After linking the notions of pre-incident and post-incident services to the concepts of self-protection and self-insurance, we show that when pricing a single contract, the insurer would always shift the full cost of self-protection services to the insured; however, this does not generally hold for the pricing of self-insurance services or when taking a portfolio viewpoint. We illustrate the latter statement using toy examples of risks with dependence mechanisms representative of the cyber context.
Journal Article
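Distortion risk measures, which both parties in the Stackelberg game use, have a simple empirical form: sort the losses and reweight the tail through a distortion function g. A sketch with a proportional-hazard distortion g(p) = p^0.7 and a toy loss sample — none of this is the paper's calibration:

```python
def distortion_premium(losses, g):
    """Distortion risk measure of an empirical loss distribution:
    rho = sum_i x_(i) * [g((n-i)/n) - g((n-i-1)/n)] over the ascending
    order statistics; a concave g overweights the tail (risk aversion).
    With g(p) = p this reduces to the sample mean."""
    xs = sorted(losses)
    n = len(xs)
    return sum(x * (g((n - i) / n) - g((n - i - 1) / n))
               for i, x in enumerate(xs))

# proportional-hazard distortion, concave for exponent < 1
g_ph = lambda p: p ** 0.7
losses = [0.0] * 90 + [10.0] * 9 + [100.0]   # toy cyber-loss sample
mean_loss = sum(losses) / len(losses)
premium = distortion_premium(losses, g_ph)
```

The gap between `premium` and `mean_loss` is the risk loading induced by the concavity of g.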
Is accumulation risk in cyber methodically underestimated?
by Zeller, Gabriela; Scherer, Matthias
in Accumulation Risk, Applications of Mathematics, Cyber Insurance
2024
Many insurers have started to underwrite cyber in recent years. In parallel, they developed their first actuarial models to cope with this new type of risk. At the portfolio level, two major challenges are the adequate modelling of the dependence structure among cyber losses and the lack of suitable data on which to calibrate the model. The purpose of this article is to highlight the importance of taking a holistic approach to cyber. In particular, we argue that actuarial modelling should not be viewed stand-alone, but rather as an integral part of an interconnected value chain with other processes such as cyber-risk assessment and cyber-claims settlement. We illustrate that if these data-collection processes are not aligned with the actuarial (dependence) model, naïve data collection necessarily leads to a dangerous underestimation of accumulation risk. We illustrate the detrimental effects on the assessment of the dependence structure and portfolio risk by using a simple mathematical model for dependence through common vulnerabilities. The study concludes by highlighting the practical implications for insurers.
Journal Article
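The dependence-through-common-vulnerabilities mechanism can be sketched with a common-shock toy model. All probabilities below are invented for illustration, but the qualitative effect — a far heavier portfolio tail than an independence model suggests — matches the article's point:

```python
import random

def portfolio_losses(n_firms, p_idio, p_common, n_sims, common=True, seed=0):
    """Number of breached firms per year.  Each firm fails idiosyncratically
    with prob p_idio; with prob p_common a shared vulnerability is exploited
    and every exposed firm fails at once (the accumulation scenario)."""
    rng = random.Random(seed)
    out = []
    for _ in range(n_sims):
        hit = sum(rng.random() < p_idio for _ in range(n_firms))
        if common and rng.random() < p_common:
            hit = n_firms  # common vulnerability: the whole portfolio is affected
        out.append(hit)
    return out

def quantile(xs, q):
    xs = sorted(xs)
    return xs[min(int(q * len(xs)), len(xs) - 1)]

indep = portfolio_losses(100, 0.05, 0.02, 20_000, common=False)
dep = portfolio_losses(100, 0.05, 0.02, 20_000, common=True)
var_indep = quantile(indep, 0.995)   # 99.5% quantile, independence assumed
var_dep = quantile(dep, 0.995)       # same quantile with the common shock
```

A naïve data-collection process that never records the shared root cause would effectively calibrate the `common=False` model, and so report `var_indep` instead of `var_dep`.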
The standard formula of Solvency II: a critical discussion
by Stahl, Gerhard; Scherer, Matthias
in Actuarial science, Applications of Mathematics, Balance sheets
2021
Establishing a standard formula (SF) for the regulation of European insurance companies is a Herculean task. It has to acknowledge very different business models and national peculiarities. In addition, regulatory authorities—as stakeholders in their own right—have a number of supervisory objectives the SF should incentivize. Since the SF intervenes in economic activities, the principle of equal treatment must be maintained. The large circle of users makes its procedural simplicity indispensable to ensure that it is applied and implemented in a proportionate manner. Above all, the SF should be risk-sensitive. Compared to Solvency I, the SF of Solvency II is considered a significant improvement, as many of the aforementioned desiderata have been much better realized. The following analysis and survey of model-theoretical aspects of the SF shows that these improvements could be achieved above all with regard to epistemic uncertainties. The stochastic model underlying the SF is still subject to considerable uncertainties, so the probability functional of the SF is exposed to significant model risk. As part of the Own Risk and Solvency Assessment (ORSA), insurance companies must prove the adequacy of the SF for their company. The vague prior knowledge represented by the stochastic component of the SF is not sufficient for an SF-intrinsic validation of the aleatoric component.
Journal Article
A neural network approach for the mortality analysis of multiple populations: a case study on data of the Italian population
by Scherer, Matthias; Ungolo, Francesco; Euthum, Maximilian
in Applications of Mathematics, Case studies, Case Study
2024
A Neural Network (NN) approach for the modelling of mortality rates in a multi-population framework is compared to three classical mortality models. The NN setup contains two types of recurrent NNs: Long Short-Term Memory (LSTM) and Gated Recurrent Unit (GRU) networks. The stochastic approaches comprise the Li and Lee model, the Common Age Effect model of Kleinow, and the model of Plat. All models are applied and compared in a large case study on decades of data of the Italian population, divided into counties. In this case study, a new index of multiple deprivation is introduced and used to classify all Italian counties based on socio-economic indicators, sourced from the Italian office of national statistics (ISTAT). The aforementioned models are then used to model and predict mortality rates of groups of different socio-economic characteristics, sex, and age.
Journal Article
Global Sensitivity Analysis of Economic Model Predictive Longitudinal Motion Control of a Battery Electric Vehicle
by Braband, Matthias; Scherer, Matthias; Voos, Holger
in Automobile industry, Economic analysis, Economic models
2022
Global warming forces the automotive industry to reduce real driving emissions and thus its CO2 footprint. Besides maximizing the individual efficiency of powertrain components, there is also energy-saving potential in the choice of driving strategy. Many research works have noted the potential of model predictive control (MPC) methods to reduce energy consumption. However, this results in a complex control system with many parameters that affect the energy efficiency. Thus, an important question remains: how do these partially uncertain (system or controller) parameters influence the energy efficiency? In this article, a global variance-based sensitivity analysis method is used to answer this question. To this end, a detailed powertrain model controlled by a longitudinal nonlinear MPC (NMPC) is developed and parameterized. Afterwards, a qualitative Morris screening is performed on this model in order to reduce the parameter set. Subsequently, the remaining parameters are quantified using Generalized Sobol Indices, in order to take the time dependence of physical processes into account. This analysis reveals that the variations in vehicle mass, battery temperature, rolling resistance and auxiliary consumers have the greatest influence on the energy consumption. In contrast, the parameters of the NMPC only account for a maximum of 5% of the output variance.
Journal Article
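Variance-based first-order Sobol indices of the kind used in the article can be estimated with the standard pick-freeze Monte-Carlo scheme. The toy linear "model" and its coefficients below are assumptions for illustration, not the powertrain model of the paper:

```python
import random

def sobol_first_order(f, dim, n, seed=0):
    """Pick-freeze Monte-Carlo estimator of first-order Sobol indices:
    S_i = Cov(Y, Y_i) / Var(Y), where Y_i re-uses input i from one sample
    matrix and takes all other inputs from an independent matrix.
    Independent U(0, 1) inputs are assumed."""
    rng = random.Random(seed)
    A = [[rng.random() for _ in range(dim)] for _ in range(n)]
    B = [[rng.random() for _ in range(dim)] for _ in range(n)]
    yA = [f(a) for a in A]
    mean = sum(yA) / n
    var = sum((y - mean) ** 2 for y in yA) / n
    S = []
    for i in range(dim):
        # evaluate on B with column i "frozen" from A
        yABi = [f(b[:i] + [a[i]] + b[i + 1:]) for a, b in zip(A, B)]
        cov = sum(ya * yb for ya, yb in zip(yA, yABi)) / n - mean ** 2
        S.append(cov / var)
    return S

# toy stand-in where the first input ("mass") dominates; for independent
# uniform inputs the exact indices are a_i^2 / sum(a_j^2) = 16/21, 4/21, 1/21
f = lambda x: 4.0 * x[0] + 2.0 * x[1] + 1.0 * x[2]
S = sobol_first_order(f, dim=3, n=50_000)
```

Generalized (time-dependent) Sobol indices, as in the paper, extend this by aggregating such indices over the output trajectory.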
Pricing corporate bonds in an arbitrary jump-diffusion model based on an improved Brownian-bridge algorithm
2011
We provide an efficient and unbiased Monte Carlo simulation for the computation of bond prices in a structural default model with jumps. The algorithm requires the evaluation of integrals with the density of the first-passage time of a Brownian bridge as the integrand. Metwally and Atiya suggest an approximation of these integrals. We improve this approximation in terms of precision. We show, from a modeling point of view, that a structural model with jumps is able to endogenously generate stochastic recovery rates. It is well known that allowing a sudden default by a jump results in a positive limit of credit spreads at the short end of the term structure. We provide an explicit formula for this limit, depending only on the Lévy measure of the logarithm of the firm-value process, the recovery rate and the distance to default.
Journal Article
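The first-passage machinery referenced in the abstract rests on the classical barrier-crossing probability of a Brownian bridge: a bridge from a to b on [0, T] with a, b > 0 hits 0 with probability exp(-2ab/(σ²T)). A sketch of this formula with a crude Euler-scheme check — parameters are illustrative, and this is not the paper's improved algorithm:

```python
import math
import random

def bridge_hit_prob(a, b, T, sigma=1.0):
    """Probability that a Brownian bridge from a to b on [0, T]
    (a, b > 0, volatility sigma) hits the barrier 0: exp(-2ab / (sigma^2 T))."""
    return math.exp(-2.0 * a * b / (sigma ** 2 * T))

def mc_hit_prob(a, b, T, steps=1000, sims=5000, seed=0):
    """Euler sanity check: simulate the bridge SDE
    dX = (b - X)/(T - t) dt + dW on a grid and count paths crossing 0.
    Discrete monitoring slightly underestimates the continuous probability."""
    rng = random.Random(seed)
    dt = T / steps
    hits = 0
    for _ in range(sims):
        x, t = a, 0.0
        for _ in range(steps):
            x += (b - x) / (T - t) * dt + rng.gauss(0.0, math.sqrt(dt))
            t += dt
            if x <= 0.0:
                hits += 1
                break
    return hits / sims

p_exact = bridge_hit_prob(a=1.0, b=0.5, T=1.0)   # exp(-1) ~ 0.368
```

The closed form is what makes Brownian-bridge sampling between simulated jump times cheap: no path discretisation is needed for the no-jump segments.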