68 results for "Kearns, Edward"
Exposure of the US population to extreme precipitation risk has increased due to climate change
The magnitude and frequency of extreme precipitation events in the early twenty-first century have already proven to be increasing more quickly than previously anticipated. Currently, the biggest consequence of this change is the lack of a climate-adjusted national standard that accounts for these recent increases and could be used to prevent loss of life and property from catastrophic precipitation-driven floods. Here, we address how the change in extreme precipitation compares against the current national standard for precipitation climatology (NOAA Atlas 14) and how much of the population is affected by the underestimation of this risk in the contiguous United States (CONUS). We find that extreme precipitation in the early twenty-first century has outpaced the current national standard in half of CONUS, and that the heavy precipitation events experienced recently are quickly becoming a "new normal", which will increase in severity and frequency in a continually changing climate. Over three-quarters of the U.S. population will likely experience this new normal of extreme precipitation. As much as one-third of the population is expected to experience the current definition of a 1-in-100-year storm as often as three times in their lifetime. Additionally, the current precipitation standards for designing transportation infrastructure and urban stormwater drainage systems, which are built upon Atlas 14, may be insufficient to protect public safety and personal and community property from severe flooding. Areas where flood risk is mitigated by operating hydraulic and adaptation structures urgently need to assess the impact of increased hourly extreme precipitation and reevaluate their applicable operation rules. Understanding and predicting the patterns and likelihood of short-duration heavy precipitation would be beneficial in preparing for severe precipitation-driven disasters, such as flash floods and landslides, which will happen more frequently in a changing climate. Following the results of this analysis, we strongly recommend accelerating the development and dissemination of the next generation of the national standard, climatically adjusted to adapt to the new normal.
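The "three times in a lifetime" figure is exceedance arithmetic. A minimal sketch, treating annual exceedances as a Poisson process; the 80-year lifetime and the 3x rate multiplier are illustrative assumptions, not values from the paper:

```python
from math import exp, factorial

def p_at_least(k, annual_rate, years):
    """P(>= k exceedances in `years`) for a Poisson process
    with the given annual exceedance rate."""
    lam = annual_rate * years
    return 1 - sum(exp(-lam) * lam**n / factorial(n) for n in range(k))

LIFETIME = 80  # assumed lifetime in years, for illustration only

# Stationary climate: a "1-in-100-year" storm has annual rate 0.01.
print(p_at_least(3, 0.01, LIFETIME))  # ~0.047

# If the true exceedance rate has tripled (an assumed multiplier),
# seeing that storm 3+ times in a lifetime becomes far more likely:
print(p_at_least(3, 0.03, LIFETIME))  # ~0.43
```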
Integrating climate change induced flood risk into future population projections
Flood exposure has been linked to shifts in population sizes and composition. Traditionally, these changes have been observed at a local level, providing insight into local dynamics but not general trends, or at a coarse resolution that does not capture localized shifts. Using historic flood data between 2000 and 2023 across the contiguous United States (CONUS), we identify the relationships between flood exposure and population change. We demonstrate that observed declines in population are statistically associated with higher levels of historic flood exposure, which may subsequently be coupled with future population projections. Several locations have already begun to see population responses to observed flood exposure and are forecasted to have decreased future growth rates as a result. Finally, we find that exposure to high-frequency flooding (5- and 20-year return periods) results in 2-7% lower growth rates than baseline projections. This is exacerbated in areas with relatively high exposure to frequent flooding, where growth is expected to decline over the next 30 years. Using historical data across the U.S., the authors find that population declines are associated with flood exposure. Projecting this relationship to 2053, the authors find that flood risk may result in 7% lower growth than otherwise expected.
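How a "7% lower growth" figure propagates to 2053 depends on interpretation. A small sketch under one possible reading, with the baseline growth rate and starting population invented for illustration:

```python
def project(pop, rate, years):
    """Compound a constant annual growth rate over `years`."""
    return pop * (1 + rate) ** years

pop_2023 = 100_000  # illustrative starting population
rate = 0.005        # assumed 0.5%/yr baseline growth rate
years = 30          # 2023 -> 2053

baseline = project(pop_2023, rate, years)

# One reading of "7% lower growth": the projected *growth increment*
# (not the population itself) shrinks by 7%.
growth = baseline - pop_2023
exposed = pop_2023 + 0.93 * growth

print(f"baseline 2053:      {baseline:,.0f}")
print(f"flood-exposed 2053: {exposed:,.0f}")
```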
A Coupled Wildfire-Emission and Dispersion Framework for Probabilistic PM2.5 Estimation
Accurate representation of fire emissions and smoke transport is crucial for current and future wildfire-smoke projections. We present a flexible modeling framework for emissions sourced from the First Street Foundation Wildfire Model (FSF-WFM) to provide a national map of near-surface smoke conditions exceeding the threshold for unhealthy concentrations of particulate matter at or below 2.5 µm, or PM2.5. Smoke yield from simulated fires is converted to emissions transported by the National Oceanic and Atmospheric Administration’s HYSPLIT model. We present a strategy for sampling from a simulation of ~65 million individual fires to depict the occurrence of “unhealthy smoke days”, defined as a 24-h average PM2.5 concentration greater than 35.4 µg/m³ from HYSPLIT. Comparison with historical smoke simulations finds reasonable agreement using only a small subset of simulated fires. A threshold of 10¹⁵ µg on the total released PM2.5 mass was found to be effective for simulating the occurrence of unhealthy days without significant computational burden.
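Both operative thresholds in the abstract are straightforward to encode. A sketch, assuming hourly PM2.5 series and a per-fire mass field (both data layouts are invented; the paper's actual inputs are HYSPLIT outputs):

```python
UNHEALTHY_24H = 35.4    # µg/m³ 24-h average cutoff, from the paper
MASS_THRESHOLD = 1e15   # µg total released PM2.5, from the paper

def unhealthy_days(hourly_pm25):
    """Count days whose 24-h average PM2.5 exceeds the cutoff."""
    days = [hourly_pm25[i:i + 24] for i in range(0, len(hourly_pm25), 24)]
    return sum(1 for d in days
               if len(d) == 24 and sum(d) / 24 > UNHEALTHY_24H)

def fires_to_simulate(fires):
    """Keep only fires worth transporting: total emitted PM2.5 mass
    above the 1e15 µg threshold (dict key is an assumed layout)."""
    return [f for f in fires if f["pm25_mass_ug"] >= MASS_THRESHOLD]
```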
The Construction of Probabilistic Wildfire Risk Estimates for Individual Real Estate Parcels for the Contiguous United States
The methodology used by the First Street Foundation Wildfire Model (FSF-WFM) to compute estimates of the 30-year, climate-adjusted aggregate wildfire hazard for the contiguous United States at 30 m horizontal resolution is presented. The FSF-WFM integrates several existing methods from the wildfire science community and implements computationally efficient and scalable modeling techniques to allow for new high-resolution, CONUS-wide hazard generation. Burn probability, flame length, and ember spread for the years 2022 and 2052 are computed from two ten-year representative Monte Carlo simulations of wildfire behavior, utilizing augmented LANDFIRE fuel estimates updated with all the available disturbance information. FSF-WFM utilizes ELMFIRE, an open-source, Rothermel-based wildfire behavior model, and multiple US Federal Government open data sources to drive the simulations. LANDFIRE non-burnable fuel classes within the wildland–urban interface (WUI) are replaced with fuel estimates from machine-learning models, trained on data from historical fires, to allow the propagation of wildfire through the WUI in the model. Historical wildfire ignition locations and NOAA’s hourly time series of surface weather at 2.5 km resolution are used to drive ELMFIRE to produce wildfire hazards representative of the 2022 and 2052 conditions at 30 m resolution, with the future weather conditions scaled to the IPCC CMIP5 RCP4.5 model ensemble predictions. Winds and vegetation were held constant between the 2022 and 2052 simulations, so climate change’s impacts on future fuel conditions are the main contributors to the changes observed in the 2052 results. Non-zero wildfire exposure is estimated for 71.8 million out of 140 million properties across CONUS. Climate change impacts add another 11% of properties to this non-zero exposure class over the next 30 years, with much of this change observed in the forested areas east of the Mississippi River. “Major” aggregate wildfire exposure of greater than 6% over the 30-year analysis period from 2022 to 2052 is estimated for 10.2 million properties. The FSF-WFM represents a notable contribution to the ability to produce property-specific, climate-adjusted wildfire risk assessments in the US.
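The 6% "major" 30-year cutoff can be related to an equivalent constant annual burn probability under an independence assumption (a simplification for intuition only; the FSF-WFM derives hazard from Monte Carlo simulation, not from this formula):

```python
# Aggregate multi-year exposure from an annual burn probability,
# assuming independent years: P = 1 - (1 - p)^n.

def aggregate_exposure(p_annual, years=30):
    """30-year aggregate exposure for a constant annual probability."""
    return 1 - (1 - p_annual) ** years

# Annual probability corresponding to the paper's 6% "major" cutoff:
p = 1 - (1 - 0.06) ** (1 / 30)
print(f"equivalent annual burn probability ~ {p:.4%}")  # ~0.2062%
print(f"check: {aggregate_exposure(p):.2%}")            # 6.00%
```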
High-Resolution Estimation of Monthly Air Temperature from Joint Modeling of In Situ Measurements and Gridded Temperature Data
Surface air temperature is an important variable in quantifying extreme heat, but high-resolution temporal and spatial measurement is limited by sparse climate-data stations. As a result, hyperlocal models of extreme heat involve intensive physical data collection efforts or analyze satellite-derived land-surface temperature instead. We developed a geostatistical model that integrates in situ climate-quality temperature records, gridded temperature data, land-surface temperature estimates, and spatially consistent covariates to predict monthly averaged daily maximum surface-air temperatures at spatial resolutions up to 30 m. We trained and validated the model using data from North Carolina. The fitted model showed strong predictive performance with a mean absolute error of 1.61 °F across all summer months and a correlation coefficient of 0.75 against an independent hyperlocal temperature model for the city of Durham. We show that the proposed model framework is highly scalable and capable of producing realistic temperature fields across a variety of physiographic settings, even in areas where no climate-quality data stations are available.
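The two validation statistics quoted (mean absolute error and correlation coefficient) are standard; a minimal sketch of both, with the prediction/observation pairs left to the reader:

```python
from math import sqrt

def mae(pred, obs):
    """Mean absolute error, the metric behind the 1.61 °F figure."""
    return sum(abs(p - o) for p, o in zip(pred, obs)) / len(pred)

def pearson_r(x, y):
    """Pearson correlation coefficient, as used for the Durham check."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = sqrt(sum((a - mx) ** 2 for a in x))
    sy = sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)
```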
Sustained Production of Multidecadal Climate Records
The key objective of the NOAA Climate Data Record (CDR) program is the sustained production of high-quality, multidecadal time series data describing the global atmosphere, oceans, and land surface that can be used for informed decision-making. The challenges of a long-term program of sustaining CDRs, as contrasted with short-term efforts of traditional 3-yr research programs, are substantial. The sustained production of CDRs requires collaboration between experts in the climate community, data management, and software development and maintenance. It is also informed by scientific application and associated user feedback on the accessibility and usability of the produced CDRs. The CDR program has developed a metric for assessing the maturity of CDRs with respect to data management, software, and user application and applied it to over 30 CDRs. The main lesson learned over the past 7 years is that a rigorous team approach to data management, employing subject matter experts at every step, is critical to open and transparent production. This approach also makes it much easier to support the needs of users who want near-real-time production of CDRs for monitoring and users who want to use CDRs for tailored, derived information, such as a drought index.
An Independent Assessment of Pathfinder AVHRR Sea Surface Temperature Accuracy Using the Marine Atmosphere Emitted Radiance Interferometer (MAERI)
The remotely sensed sea surface temperature (SST) estimated from the 4-km-resolution Pathfinder SST algorithm is compared to an SST locally measured by the Marine Atmospheric Emitted Radiance Interferometer (MAERI) during five oceanographic cruises in the Atlantic and Pacific Oceans, in conditions ranging from Arctic to equatorial. The Pathfinder SST is a product of the satellite-based Advanced Very High Resolution Radiometer, while the MAERI is an infrared radiometric interferometer with continuous onboard calibration that can provide highly accurate (better than 0.05°C) in situ skin temperatures during extended shipboard deployments. Matchups, which are collocated (within 4 km) and coincident (±40 min during the day; ±120 min during the night) data, from these two different sources under cloud-free conditions are compared. The average difference between the MAERI and Pathfinder SSTs is found to be 0.07 ±0.31°C from 219 matchups during the low- and midlatitude cruises; inclusion of 80 more matchups from the Arctic comparisons produces an average global difference of 0.14 ±0.36°C. The MAERI–Pathfinder differences compare favorably with the average midlatitude differences between the MAERI skin SST and other bulk SST estimates commonly available for these cruises, such as the research vessels’ thermosalinograph SST (0.12 ±0.17°C) and the weekly National Centers for Environmental Prediction optimally interpolated SST analysis (0.41 ±0.58°C). While not representative of all possible oceanic and atmospheric regimes, the accuracy of the Pathfinder SST estimates under the conditions sampled by the five cruises is found to be at least twice as good as previously demonstrated.
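The matchup definition is purely spatial and temporal, so the collocation filter is easy to sketch. The record objects with .lat, .lon, and .time fields are an assumed layout, not the paper's data format:

```python
from datetime import timedelta
from math import radians, sin, cos, asin, sqrt

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance in km between two points."""
    dlat, dlon = radians(lat2 - lat1), radians(lon2 - lon1)
    a = (sin(dlat / 2) ** 2
         + cos(radians(lat1)) * cos(radians(lat2)) * sin(dlon / 2) ** 2)
    return 2 * 6371 * asin(sqrt(a))

def is_matchup(sat, ship, daytime):
    """Apply the paper's matchup window: collocated within 4 km and
    coincident within ±40 min (day) or ±120 min (night)."""
    window = timedelta(minutes=40 if daytime else 120)
    close = haversine_km(sat.lat, sat.lon, ship.lat, ship.lon) <= 4.0
    coincident = abs(sat.time - ship.time) <= window
    return close and coincident
```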
Unlocking the Potential of NEXRAD Data through NOAA’s Big Data Partnership
The National Oceanic and Atmospheric Administration’s (NOAA) Big Data Partnership (BDP) was established in April 2015 through cooperative research agreements between NOAA and selected commercial and academic partners. The BDP is investigating how the value inherent in NOAA’s data may be leveraged to broaden their utilization through modern cloud infrastructures and advanced “big data” techniques. NOAA’s Next Generation Weather Radar (NEXRAD) data were identified as an ideal candidate for such collaborative efforts. NEXRAD Level II data are valuable yet challenging to utilize in their entirety, and recent advances in weather radar science can be applied to both the archived and real-time data streams. NOAA’s National Centers for Environmental Information (NCEI) transferred the complete NEXRAD Level II historical archive, originating in 1991, through North Carolina State University’s Cooperative Institute for Climate and Satellites (CICS-NC) to interested BDP collaborators. Amazon Web Services (AWS) has received and made freely available the complete archived Level II data through its AWS platform. AWS then partnered with Unidata/University Corporation for Atmospheric Research (UCAR) to establish a real-time NEXRAD feed, thereby providing on-demand dissemination of both archived and current data seamlessly through the same access mechanism by October 2015. To organize, verify, and utilize the NEXRAD data on its platform, AWS further partnered with the Climate Corporation. This collective effort among federal government, private industry, and academia has already realized a number of new and novel applications that employ NOAA’s NEXRAD data, at no net cost to the U.S. taxpayer. The volume of accessed NEXRAD data, including this new AWS platform service, has increased by 130%, while the amount of data delivered by NOAA/NCEI has decreased by 50%.
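The archived Level II data described here remain openly accessible on AWS. A minimal sketch of anonymous access with boto3; the noaa-nexrad-level2 bucket name and its YYYY/MM/DD/SITE key layout reflect the publicly documented service, but should be verified against the current AWS Open Data listing:

```python
import boto3
from botocore import UNSIGNED
from botocore.config import Config

# Anonymous (unsigned) client: the NEXRAD archive is a public bucket,
# so no AWS credentials are required for read access.
s3 = boto3.client("s3", config=Config(signature_version=UNSIGNED))

resp = s3.list_objects_v2(
    Bucket="noaa-nexrad-level2",
    Prefix="2015/10/01/KTLX/",  # one radar site, one day
    MaxKeys=5,
)
for obj in resp.get("Contents", []):
    print(obj["Key"], obj["Size"])
```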
Mauclair and the Musical World of the Fin de Siècle and the Belle Époque
Mauclair, encouraged by Mallarmé in 1891, entered young upon the career of an industrious writer. A critic in all areas of the arts, he is most remembered in the fine arts. First a novelist, then a 'Schumannian' poet and biographer of Schumann, he gave considerable attention, from 1903 to 1919, to the contemporary musical world, with a "Histoire" (from 1850), two books of essays, and many articles, along with, after 1919, reminiscences in two books of memoirs. Mauclair's predilection is for the fortunes of Wagner, Debussy, and Franck. He avows a standpoint of collective humanitarianism resembling the "l'Altruisme" of René Ghil, and one that is conservative in a way attuned to much belle époque thought.