Catalogue Search | MBRL
1,181 result(s) for "Taylor, Greg"
Chain Ladder Under Aggregation of Calendar Periods
2025
The chain ladder model is defined by a set of assumptions about the claim array to which it is applied. It is, in practice, applied to claim arrays whose data relate to different frequencies, e.g., yearly, quarterly, monthly, or weekly. There is sometimes a tacit assumption that one can shift between these frequencies at will, and that the model will remain applicable. It is not obvious that this is the case. One needs to check whether a model whose assumptions hold for annual data will continue to hold for a quarterly (for example) representation of the same data. The present paper studies this question in the case of preservation of calendar periods, i.e., (in the example) annual calendar periods are dissected into quarters. The study covers the two most common forms of chain ladder model, namely the Tweedie chain ladder and the Mack chain ladder. The conclusion is broadly, if not absolutely, negative. Certain parameter sets can indeed be found for which the chain ladder structure is maintained under a change in data frequency. However, while it may be technically possible to maintain the chain ladder model under such a change to the data, it is not possible in any reasonable, practical sense.
Journal Article
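For readers unfamiliar with the model these abstracts discuss, the following is a minimal sketch of the basic chain ladder mechanics: age-to-age (development) factors are estimated as ratios of column sums of a cumulative claims triangle, then applied to complete the triangle. The toy triangle, function name, and variable names are illustrative assumptions, not taken from any of the papers listed here.

```python
def chain_ladder_forecast(triangle):
    """Sketch of the basic chain ladder method.

    triangle: list of rows (accident periods) of cumulative claims,
    each later row observed for one fewer development period.
    Returns (age-to-age factors, completed triangle).
    """
    n = len(triangle)
    # Development factor for column j -> j+1: ratio of column sums,
    # taken over accident periods with both columns observed.
    factors = []
    for j in range(n - 1):
        rows = [r for r in triangle if len(r) > j + 1]
        factors.append(sum(r[j + 1] for r in rows) / sum(r[j] for r in rows))
    # Complete each row by chaining the remaining factors.
    completed = []
    for r in triangle:
        row = list(r)
        for j in range(len(r) - 1, n - 1):
            row.append(row[-1] * factors[j])
        completed.append(row)
    return factors, completed

# Illustrative 3x3 triangle of cumulative claims.
tri = [
    [100, 150, 165],
    [110, 160],
    [120],
]
factors, full = chain_ladder_forecast(tri)
```

The papers above concern what happens to such a model, and to the variance of the resulting reserve estimate, when the time unit of the triangle (its "mesh size") is changed.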
The Mack Chain Ladder and Data Granularity for Preserved Development Periods
2025
This paper is concerned with the choice of data granularity for the application of the Mack chain ladder model to forecast a loss reserve. It is a sequel to a related paper by Taylor, which considers the same question for the EDF chain ladder model. As in the earlier paper, it considers the question as to whether a decrease in the time unit leads to an increase or decrease in the variance of the loss reserve estimate. The question of whether a Mack chain ladder that is valid for one time unit (here called mesh size) remains so for another is investigated. The conditions under which the model does remain valid are established. There are various ways in which the mesh size of a data triangle may be varied, two of them of particular interest. The paper examines one of these, namely that in which development periods are preserved. Two versions of this are investigated: 1. the aggregation of development periods without change to accident periods; 2. the aggregation of accident periods without change to development periods. Taylor found that, in the case of the Poisson chain ladder, an increase in mesh size always increases the variance of the loss reserve estimate (subject to mild technical conditions). The case of the Mack chain ladder is more nuanced in that an increase in variance is not always guaranteed. Whether an increase or a decrease occurs depends on the numerical values of certain of the age-to-age factors actually observed. The threshold values of the age-to-age factors at which an increase transitions to a decrease in variance are calculated. In the case of a change in the mesh of development periods, but with no change to accident periods, these values are computed for one particular data set, where it is found that variance always increases. It is conjectured that data sets in which this does not happen would be relatively rare. The situation is somewhat different when changes in mesh size over accident periods are considered. Here, the question of an increase or decrease in variance is more complex, and, in general terms, the occurrence of an increase in variance with increased mesh size is less likely.
Journal Article
The Exponential Dispersion Family (EDF) Chain Ladder and Data Granularity
2025
This paper is concerned with the choice of data granularity for application of the EDF (Exponential Dispersion Family) chain ladder model to forecast a loss reserve. As the duration of individual accident and development periods is decreased, the number of data points increases, but the volatility of each point increases. This leads to a question as to whether a decrease in time unit leads to an increase or decrease in the variance of the loss reserve estimate. Is there an optimal granularity with respect to the variance of the loss reserve? A preliminary question is that of whether an EDF chain ladder that is valid for one duration (here called mesh size) remains so for another. The conditions under which this is so are established. There are various ways in which the mesh size of a data triangle may be varied. The paper identifies two of particular interest. For each of these two types of variation, the effect on variance of loss reserve is studied. Subject to some technical qualifications, the conclusion is that an increase in mesh size always increases the variance. It follows that one should choose a very high degree of granularity in order to maximize efficiency of loss reserve forecasting.
Journal Article
Loss reserving models: Granular and machine learning forms
2019
The purpose of this paper is to survey recent developments in granular models and machine learning models for loss reserving, and to compare the two families with a view to assessment of their potential for future development. This is best understood against the context of the evolution of these models from their predecessors, and the early sections recount relevant archaeological vignettes from the history of loss reserving. However, the larger part of the paper is concerned with the granular models and machine learning models. Their relative merits are discussed, as are the factors governing the choice between them and the older, more primitive models. Concluding sections briefly consider the possible further development of these models in the future.
Journal Article
A model of biased intermediation
2019
We study situations in which consumers rely on a biased intermediary's advice when choosing among sellers. We introduce the notion that sellers' and consumers' payoffs can be congruent or conflicting, and show that this has important implications for the effects of bias. Under congruence, the firm benefiting from bias has an incentive to offer a better deal than its rival, and consumers can be better off than under no bias. Under conflict, the favored firm offers lower utility, and bias harms consumers. We study various policies for dealing with bias and show that their efficacy also depends on whether the payoffs exhibit congruence or conflict.
Journal Article
Lightning Interferometry with the Long Wavelength Array
2023
The Long Wavelength Array is a radio telescope array located at the Sevilleta National Wildlife Refuge in La Joya, New Mexico, well suited and situated for the observation of lightning. The array consists of 256 high-sensitivity dual-polarization antennas arranged within a 100 m diameter. This paper demonstrates some of the capabilities that the array brings to the study of lightning. When 32 or more antennas are used to image lightning radio sources, virtually every integration period longer than the impulse response of the array includes at least one identifiable lightning emitter, independent of the integration period used. The use of many antennas also allows multiple simultaneous lightning radio sources to be imaged at sub-microsecond timescales; for the flash examined, 51% of the images contained more than one lightning source. Finally, by using many antennas to image lightning sources, the array is capable of locating sources fainter than the galactic background radio noise level, yielding possibly the most sensitive radio maps of lightning to date. This incredible sensitivity enables, for the first time, the emissions originating from the positive leader tips of natural in-cloud lightning to be detected and located. The tip emission is distinctly different from needle emission and is most likely due to positive breakdown.
Journal Article
The three Queenslands - Sir Samuel Griffith's 'ghost' draft for a Queensland federation
2020
From 1890 to 1892, Sir Samuel Griffith, as Premier of Queensland, promoted a scheme under which Queensland would itself have been divided into a federation of initially three provinces - North, Central and South Queensland - and then two provinces, North and South Queensland. This startling idea would certainly have changed the map of Australia, probably permanently. At least at some points, the idea was expressed that each province would enter the Australian federation as a separate State and the Queensland federal government would simply be dissolved upon federation. The Bill to divide Queensland into a federation of two provinces passed the lower House of State Parliament but was defeated in the nominee Legislative Council. It then fell victim to the change of government consequent upon Griffith's appointment as Chief Justice of Queensland, to the urgent problems presented by the economic depression, and even, from the conservative point of view, to the rise of labour in politics. Little has been known about this nearly successful plan until now. This article attempts to close that gap.
Journal Article
Risks special issue on "Granular models and machine learning models"
2020
It is probably fair to date loss reserving by means of claim modelling from the late 1960s [...]
Journal Article
Using a Field Tracer Study to Calibrate a Water Quality Model for Disinfection By-Product Formation Potential and Chlorine Decay
2023
Several years ago, the Orlando Utilities Commission (OUC) performed a tracer test to help determine the age of water in its distribution system. The OUC wanted definitive data on water quality - specifically disinfectant by-product formation and chlorine decay - at precise points in the distribution system. The field tracer study, when calibrated with OUC's laboratory modeling, allowed the utility to more accurately determine water age and quality throughout its system, including chlorine residual and disinfectant by-product concentration (specifically trihalomethanes and haloacetic acids). Though the test was conducted in 2003, the data are still in use by OUC, and have been confirmed with periodic water quality checks throughout the years.
Journal Article