1,066 results for "NATURAL CATASTROPHES"
The need for data: natural disasters and the challenges of database management
Hundreds of natural catastrophes occur worldwide every year—there were 780 loss events per year on average over the last 10 years. Since 1980, these disasters have claimed over two million lives and caused losses worth US$ 3,000 billion. The deadliest disasters were caused by earthquakes: the tsunami following the Sumatra quake (2004) and the Haiti earthquake (2010) claimed more than 220,000 lives each. The Great East Japan Earthquake of 11 March 2011 was the costliest natural disaster of all time, with total losses of US$ 210 billion. Hurricane Katrina, in 2005, was the second costliest disaster, with total losses of US$ 140 billion (in 2010 values). To ensure that high-quality natural disaster analyses can be performed, the data have to be collected, checked and managed with a high degree of expertise and professionalism. Scientists, governmental and non-governmental organisations and the finance industry make use of global databases that contain losses attributable to natural catastrophes. At present, there are three global and multi-peril loss databases: NatCatSERVICE (Munich Re), Sigma (Swiss Re) and EM-DAT (Centre for Research on the Epidemiology of Disasters). They are supplemented by numerous databases focusing on national or regional issues, certain hazards and specific sectors. This paper outlines the criteria and definitions relating to how global and multi-peril databases are operated, and the efforts being made to ensure consistent and internationally recognised standards of data management. In addition, it presents the concept and methodology underlying the NatCatSERVICE database, and points out the many challenges associated with data acquisition and data management.
The ILS loss experience: natural catastrophe issues 2001–2020
“If there were no losses, there would be no premiums,” as the insurance proverb goes. This paper analyzes the history of natural catastrophe Insurance-Linked Securities (ILS), or Cat Bonds (CB), from 2001 to 2020. Preliminary analyses summarize the annual character of issuance during that period, providing context for the principal focus of the paper, which is losses. A detailed loss record is provided, including why and when losses occurred. This record, when set against the historic issuance, allows us to address several important questions, unaddressed in the literature but constantly posed by practitioners. Does the cumulative loss over 20 years equal what catastrophe models led us to expect? Were the relative sizes of actual losses reflective of expected losses? Most importantly, does the loss record support the idea that natural catastrophe models are accurate and useful? This paper is the first to specifically address these fundamental forensic questions against the loss record. It thereby makes an important contribution to the growing literature about ILS markets.
Parameter estimation of the Pareto distribution using least squares approaches blended with different rank methods and its applications in modeling natural catastrophes
The current article evaluates least-squares-based approaches for estimating the parameters of the two-parameter Pareto distribution. The algebraic expressions for the least squares (LS), relative least squares (RLS) and weighted least squares (WLS) estimators are derived by generating the empirical cumulative distribution function (CDF) using the mean rank, median rank and symmetrical CDF methods. The performance of the estimation approaches is evaluated through Monte Carlo simulations for different combinations of parameter values and sample sizes. The performance of the regression-based methods is then compared with one another and with the traditional maximum likelihood (ML) estimation method. Our simulation results show that among the regression-based methods, RLS performs better than the other two regression-based approaches for samples of all sizes. Moreover, RLS performs better than the ML method for small samples. Among the rank methods used for generating the empirical CDF, the mean rank method outperformed the other two. The simulation results are further corroborated by the application of all the methods on two real-life datasets representing damages caused by natural catastrophes.
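To make the estimation idea concrete, here is a minimal sketch of ordinary least squares fitting of the two-parameter Pareto distribution using the mean-rank plotting position, one of the approaches the abstract mentions. The linearisation, variable names, and synthetic data are illustrative assumptions, not the paper's actual algorithm or datasets.

```python
import math
import random

def pareto_ls_mean_rank(data):
    """Estimate the Pareto shape (alpha) and scale (x_m) by least squares.

    Uses the linearisation log(1 - F(x)) = alpha*log(x_m) - alpha*log(x),
    with the empirical CDF from the mean-rank method: F_i = i / (n + 1).
    """
    x = sorted(data)
    n = len(x)
    F = [i / (n + 1) for i in range(1, n + 1)]   # mean-rank plotting positions
    X = [math.log(v) for v in x]                 # regressor: log of ordered data
    Y = [math.log(1 - f) for f in F]             # response: log survival probability
    mx, my = sum(X) / n, sum(Y) / n
    slope = (sum((a - mx) * (b - my) for a, b in zip(X, Y))
             / sum((a - mx) ** 2 for a in X))
    intercept = my - slope * mx
    alpha = -slope
    xm = math.exp(intercept / alpha)
    return alpha, xm

# Quick check on synthetic Pareto(alpha=2.5, x_m=1.0) data via inverse transform
random.seed(42)
sample = [1.0 / random.random() ** (1 / 2.5) for _ in range(2000)]
a_hat, xm_hat = pareto_ls_mean_rank(sample)
```

The RLS and WLS variants the paper compares differ only in how the regression residuals are scaled or weighted; swapping the plotting position (e.g. median rank F_i = (i - 0.3)/(n + 0.4)) changes only the `F` line.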
Drought, flood, fire : how climate change contributes to catastrophes
"Every year, droughts, floods, and fires impact hundreds of millions of people and cause massive economic losses. Climate change is making these catastrophes more dangerous. Now. Not in the future: NOW. This book describes how and why climate change is already fomenting dire consequences, and will certainly make climate disasters worse in the near future. Chris Funk combines the latest science with compelling stories, providing a timely, accessible, and beautifully written synopsis of this critical topic. The book describes our unique and fragile Earth system, and the negative impacts humans are having on our support systems. It then examines recent disasters, including heat waves, extreme precipitation, hurricanes, fires, El Niños and La Niñas, and their human consequences. By clearly describing the dangerous impacts that are already occurring, Funk provides a clarion call for social change, yet also conveys the beauty and wonder of our planet, and hope for our collective future" -- Provided by publisher.
Natural hazard and disaster tourism
An observed trend, tourist interest in natural hazards and disasters, has prompted the authors to research several issues, including tourist motivations and the specific tourism properties and functions of this form of activity. A further objective was to situate this social and natural process within the general structure of tourism. This interest has a long history, and a new stage is currently forming, which partly results from factors affecting society, such as information and education, which provoke antagonistic reactions. Extreme natural phenomena typically reduce tourist interest in the destination that hosted the event; however, it never drops to zero, and differences are visible depending on the type of phenomenon. On the other hand, natural hazards and disasters are considered to hold a specific tourism value. This article discusses where this human activity belongs among the forms of tourism known to scientists, accounting for its diversity and its ethical dimensions.
Over the seawall : tsunamis, cyclones, drought, and the delusion of controlling nature
"As extreme weather becomes more common, the urge to outwit nature can be irresistible. But when our expensive technosolutions backfire, are we worse off than before? How should we adapt to a changing climate? Miller reveals the unintended consequences of bad adaptations or, as academics call them, maladaptations--fixes that do more harm than good. From seawalls in coastal Japan, to the reengineered waters in the Ganges River Delta, to the artificial ribbon of water supporting both farms and urban centers in parched Arizona, the author traces the histories of engineering marvels that were once deemed too smart and too big to fail. In each case he takes us into the land and culture, seeking out locals and experts to better understand how complicated, grandiose schemes led instead to failure, and to find answers to the technological holes we've dug ourselves into. Miller urges us to take a hard look at the fortifications we build and how they've fared in the past. He embraces humanity's penchant for problem-solving, but argues that if we are to adapt successfully to climate change, we must recognize that working with nature is not surrender but the only way to assure a secure future." -- From publisher's description.
A resimulation framework for event loss tables based on clustering
Catastrophe loss modeling has enormous relevance for various insurance companies due to the huge loss potential. In practice, geophysical-meteorological models are widely used to model these risks. These models are based on the simulation of meteorological and physical parameters that cause natural events and evaluate the corresponding effects on the insured exposure of a certain company. Due to their complexity, these models are often operated by external providers—at least seen from the perspective of a variety of insurance companies. The outputs of these models can be made available, for example, in the form of event loss tables, which contain different statistical characteristics of the simulated events and their caused losses relative to the exposure. The integration of these outputs into the internal risk model framework is fundamental for a consistent treatment of risks within the companies. The main subject of this work is the formulation of a performant resimulation algorithm of given event loss tables, which can be used for this integration task. The newly stated algorithm is based on cluster analysis techniques and represents a time-efficient way to perform sensitivities and scenario analyses.
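The general idea behind the abstract, resimulating annual losses from an event loss table (ELT) after grouping similar events into clusters to reduce the number of simulated components, can be sketched as follows. The clustering rule (binning events by loss size), the Poisson frequency assumption, and all names are illustrative assumptions, not the paper's actual algorithm.

```python
import math
import random

def cluster_elt(elt, n_clusters):
    """Group ELT rows (occurrence rate, mean loss) into clusters of similar
    loss size; each cluster keeps its total rate and rate-weighted mean loss,
    so the expected annual loss is preserved exactly."""
    rows = sorted(elt, key=lambda r: r[1])          # sort by mean loss
    size = max(1, len(rows) // n_clusters)
    clusters = []
    for i in range(0, len(rows), size):
        chunk = rows[i:i + size]
        rate = sum(r for r, _ in chunk)             # aggregated event rate
        loss = sum(r * l for r, l in chunk) / rate  # rate-weighted mean loss
        clusters.append((rate, loss))
    return clusters

def poisson(lam, rng):
    """Poisson sample via Knuth's product method (fine for small rates)."""
    L, k, p = math.exp(-lam), 0, 1.0
    while True:
        p *= rng.random()
        if p <= L:
            return k
        k += 1

def resimulate(clusters, n_years, rng):
    """Simulate annual aggregate losses: Poisson event counts per cluster."""
    return [sum(poisson(rate, rng) * loss for rate, loss in clusters)
            for _ in range(n_years)]

rng = random.Random(1)
# Toy ELT: 50 events with illustrative rates and mean losses
elt = [(0.01 * (i % 5 + 1), 1000.0 * (i + 1)) for i in range(50)]
clusters = cluster_elt(elt, n_clusters=5)
annual = resimulate(clusters, n_years=10000, rng=rng)
```

Replacing 50 event rows by 5 cluster rows is what makes repeated sensitivity and scenario runs cheap; the trade-off is that within-cluster loss variability is flattened onto the cluster's representative loss.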