166 result(s) for "Whitaker, Jeffrey S."
Evaluating Methods to Account for System Errors in Ensemble Data Assimilation
Inflation of ensemble perturbations is employed in ensemble Kalman filters to account for unrepresented error sources. The authors propose a multiplicative inflation algorithm that inflates the posterior ensemble in proportion to the amount that observations reduce the ensemble spread, resulting in more inflation in regions of dense observations. This is justified since the posterior ensemble variance is more affected by sampling errors in these regions. The algorithm is similar to the “relaxation to prior” algorithm proposed by Zhang et al., but it relaxes the posterior ensemble spread back to the prior instead of the posterior ensemble perturbations. The new inflation algorithm is compared to the method of Zhang et al. and simple constant covariance inflation using a two-level primitive equation model in an environment that includes model error. The new method performs somewhat better, although the method of Zhang et al. produces more balanced analyses whose ensemble spread grows faster. Combining the new multiplicative inflation algorithm with additive inflation is found to be superior to either of the methods used separately. Tests with large and small ensembles, with and without model error, suggest that multiplicative inflation is better suited to account for unrepresented observation-network-dependent assimilation errors such as sampling error, while model errors, which do not depend on the observing network, are better treated by additive inflation. A combination of additive and multiplicative inflation can provide a baseline for evaluating more sophisticated stochastic treatments of unrepresented background errors. This is demonstrated by comparing the performance of a stochastic kinetic energy backscatter scheme with additive inflation as a parameterization of model error.
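The spread-relaxation idea described in this abstract can be sketched in a few lines of NumPy. This is a minimal illustration, not the paper's implementation: the function name and the relaxation coefficient value are hypothetical.

```python
import numpy as np

def rtps_inflation(prior_pert, post_pert, alpha=0.6):
    """Relax the posterior ensemble *spread* back toward the prior spread.

    Unlike relaxing the perturbations themselves, this rescales each
    posterior perturbation by a factor that is largest where the
    observations reduced the spread the most (dense-observation regions).
    Arrays are (n_members, n_state); `alpha` is a tunable coefficient
    (the value here is illustrative).
    """
    sigma_b = prior_pert.std(axis=0, ddof=1)   # prior ensemble spread
    sigma_a = post_pert.std(axis=0, ddof=1)    # posterior ensemble spread
    # Multiplicative factor per state element; 1 where spread is unchanged.
    factor = 1.0 + alpha * (sigma_b - sigma_a) / sigma_a
    return post_pert * factor
```

With `alpha = 1` the posterior spread is restored exactly to the prior spread; with `alpha = 0` the perturbations are left untouched.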
A Four-Dimensional Incremental Analysis Update for the Ensemble Kalman Filter
The analysis produced by the ensemble Kalman filter (EnKF) may be dynamically inconsistent and contain unbalanced gravity waves that are absent in the real atmosphere. These imbalances can be exacerbated by covariance localization and inflation. One strategy to combat the imbalance in the analyses is the incremental analysis update (IAU), which uses the dynamic model to distribute the analysis increments over a time window. The IAU has been widely used in atmospheric and oceanic applications. However, the analysis increment that is gradually introduced during a model integration is often computed once and assumed to be constant for an assimilation window, which can be seen as a three-dimensional IAU (3DIAU). Thus, the propagation of the analysis increment in the assimilation window is neglected, yet this propagation may be important, especially for moving weather systems. To take into account the propagation of the analysis increment during an assimilation window, a four-dimensional IAU (4DIAU) used with the EnKF is presented. It constructs time-varying analysis increments by applying all observations in an assimilation window to state variables at different times during the assimilation window. It then gradually applies these time-varying analysis increments through the assimilation window. Results from a dry two-layer primitive equation model and the NCEP GFS show that EnKF with 4DIAU (EnKF-4DIAU) and 3DIAU (EnKF-3DIAU) reduce imbalances in the analysis compared to EnKF without initialization (EnKF-RAW). EnKF-4DIAU retains the time-varying information in the analysis increments better than EnKF-3DIAU, and produces better analyses and forecasts than either EnKF-RAW or EnKF-3DIAU.
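The distinction between 3DIAU and 4DIAU can be sketched with a toy integration loop: a constant increment spread over the window versus a sequence of time-varying increments. This is an illustrative sketch only; the function and variable names are hypothetical.

```python
import numpy as np

def integrate_with_iau(x0, model_step, increments, n_steps):
    """Incremental analysis update: force the model with a fraction of the
    analysis increment at every step instead of adding it all at once.

    `increments` may be a single increment with the same shape as the
    state (constant over the window, i.e. 3DIAU) or a sequence of
    per-step increments (time-varying, i.e. 4DIAU).  `model_step`
    advances the state by one model time step.
    """
    incs = np.broadcast_to(np.asarray(increments, dtype=float),
                           (n_steps,) + np.shape(x0))
    x = np.asarray(x0, dtype=float)
    for k in range(n_steps):
        # Gradual forcing: each step receives 1/n_steps of the increment.
        x = model_step(x) + incs[k] / n_steps
    return x
```

With an identity model, the 3DIAU call recovers the full increment at the end of the window, while the 4DIAU call applies the window-mean of the time-varying increments.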
Improving Assimilation of Radiance Observations by Implementing Model Space Localization in an Ensemble Kalman Filter
Experiments using the National Oceanic and Atmospheric Administration Finite‐Volume Cubed‐Sphere Dynamical Core Global Forecasting System (FV3GFS) reveal that the four‐dimensional ensemble‐variational method (4DEnVAR) performs similarly to an ensemble Kalman filter (EnKF) when no radiance observations are assimilated, but 4DEnVAR is superior to an EnKF when radiance observations are assimilated. The differences between 4DEnVAR and the EnKF are hypothesized to stem from the treatment of vertical localization, since radiance observations are integral observations in the vertical and 4DEnVAR uses model space localization while the EnKF uses observation space localization. A modulation approach, which generates an expanded ensemble from the raw ensemble and eigenvectors of the localization matrix, has been adopted to implement model space localization in the operational National Oceanic and Atmospheric Administration EnKF. As constructed, the expanded ensemble is a square root of the vertically localized background error covariance matrix, so no explicit vertical localization is necessary during the EnKF update. The size of the expanded ensemble is proportional to the rank of the vertical localization matrix: for a vertical localization scale of 1.5 (3.0) scale heights, 12 (7) eigenvectors explain 96% of the variance of the localization matrix, so the expanded ensemble is 12 (7) times larger than the raw ensemble. Results from assimilating only radiance observations in the FV3GFS model confirm that EnKF with model‐space vertical localization performs better than observation‐space localization, and produces results similar to 4DEnVAR. Moreover, a 960‐member ensemble is sufficient to turn off the vertical localization entirely and yields significant improvements compared to an 80‐member ensemble with model space localization.
Key Points: A modulation approach has been adopted to implement model‐space localization in an ensemble Kalman filter. EnKF with model‐space vertical localization performs better than observation‐space localization. A 960‐member ensemble is sufficient to turn off the vertical localization entirely and yields significant improvements over an 80‐member ensemble.
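The modulation construction described above (expanded ensemble from raw perturbations and eigenvectors of the localization matrix) can be sketched with NumPy's eigendecomposition. This is a schematic of the general idea, not the operational code; names and the variance threshold are illustrative.

```python
import numpy as np

def modulated_ensemble(X, L, var_frac=0.96):
    """Build an expanded ('modulated') ensemble for model-space
    vertical localization.

    X: raw ensemble perturbations, shape (n_levels, n_members).
    L: vertical localization correlation matrix, (n_levels, n_levels).
    Keeps the leading eigenvectors of L explaining `var_frac` of its
    variance; each retained, scaled eigenvector elementwise-modulates
    every raw member.  Up to truncation, Z satisfies
    Z Z^T = L ∘ (X X^T), i.e. Z is a square root of the vertically
    localized covariance, so no explicit localization is needed later.
    """
    w, V = np.linalg.eigh(L)            # ascending eigenvalues
    order = np.argsort(w)[::-1]
    w, V = w[order], V[:, order]        # descending
    keep = min(np.searchsorted(np.cumsum(w) / w.sum(), var_frac) + 1,
               len(w))
    modes = V[:, :keep] * np.sqrt(w[:keep])   # scaled eigenvectors
    # Every (mode, member) pair yields one expanded member.
    Z = np.stack([modes[:, j] * X[:, i]
                  for j in range(keep)
                  for i in range(X.shape[1])], axis=1)
    return Z
```

The expanded ensemble size is `keep * n_members`, matching the abstract's observation that 12 (or 7) retained eigenvectors expand the ensemble by a factor of 12 (or 7).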
NOAA’S SECOND-GENERATION GLOBAL MEDIUM-RANGE ENSEMBLE REFORECAST DATASET
A multidecadal ensemble reforecast database is now available that is approximately consistent with the operational 0000 UTC cycle of the 2012 NOAA Global Ensemble Forecast System (GEFS). The reforecast dataset consists of an 11-member ensemble run once each day from 0000 UTC initial conditions. Reforecasts are run to +16 days. As with the operational 2012 GEFS, the reforecast is run at T254L42 resolution (approximately 1/2° grid spacing, 42 levels) for week +1 forecasts and T190L42 (approximately 3/4° grid spacing) for the week +2 forecasts. Reforecasts were initialized with Climate Forecast System Reanalysis initial conditions, and perturbations were generated using the ensemble transform with rescaling technique. Reforecast data are available from 1985 to present. Reforecast datasets were previously demonstrated to be very valuable for detecting and correcting systematic errors in forecasts, especially forecasts of relatively rare events and longer-lead forecasts. What is novel about this reforecast dataset relative to the first-generation NOAA reforecast is that (i) a modern, currently operational version of the forecast model is used (the previous reforecast used a model version from 1998); (ii) a much larger set of output data has been saved, including variables relevant for precipitation, hydrologic, wind energy, solar energy, severe weather, and tropical cyclone forecasting; and (iii) the archived data are at much higher resolution. The article describes the reforecast configuration in more detail and provides a few examples of how this second-generation reforecast data may be used for research and a variety of weather forecast applications.
Diagnosis of the Source of GFS Medium-Range Track Errors in Hurricane Sandy (2012)
Medium-range forecasts of Hurricane Sandy’s track were characterized by widely diverging solutions, with some suggesting that Sandy would make landfall over the mid-Atlantic region of the United States, while others forecast the storm to move due east to the north of Bermuda. Here, dynamical processes responsible for the eastward-tracking forecasts are diagnosed using an 80-member ensemble of experimental Global Forecast System (GFS) forecasts initialized five days prior to landfall. Comparing the ensemble members with tracks to the east against those with tracks to the west indicates that the eastern members were characterized by a lower-amplitude upper-tropospheric anticyclone on the poleward side of Sandy during the first 24 h of the forecast, which in turn was associated with a westerly perturbation steering wind. The amplification of this ridge in each set of members was modulated by differences in the advection of potential vorticity (PV) by the irrotational wind associated with Sandy’s secondary circulation and isentropic lift along a warm front that formed on the poleward side of Sandy. The amplitude of the irrotational wind in this region was proportional to the 0-h water vapor mixing ratio, and to a lesser extent the 0-h upper-tropospheric horizontal divergence. These two quantities modulated the vertical profile of grid-scale condensation within the model and subsequent upper-tropospheric divergence. The results from this study suggest that additional observations within regions of large-scale precipitation outside the tropical cyclone (TC) core could benefit TC track forecasts, particularly when the TC is located near an upper-tropospheric PV gradient.
Correcting Systematic and State‐Dependent Errors in the NOAA FV3‐GFS Using Neural Networks
Weather forecasts made with imperfect models contain state‐dependent errors. Data assimilation (DA) partially corrects these errors with new information from observations. As such, the corrections, or “analysis increments,” produced by the DA process embed information about model errors. An attempt is made here to extract that information to improve numerical weather prediction. Neural networks (NNs) are trained to predict corrections to the systematic error in the National Oceanic and Atmospheric Administration's FV3‐GFS model based on a large set of analysis increments. A simple NN focusing on an atmospheric column significantly improves the estimated model error correction relative to a linear baseline. Leveraging large‐scale horizontal flow conditions using a convolutional NN, when compared to the simple column‐oriented NN, does not improve skill in correcting model error. The sensitivity of model error correction to forecast inputs is highly localized by vertical level and by meteorological variable, and the error characteristics vary across vertical levels. Once trained, the NNs are used to apply an online correction to the forecast during model integration. Improvements are evaluated both within a cycled DA system and across a collection of 10‐day forecasts. It is found that applying state‐dependent NN‐predicted corrections to the model forecast improves the overall quality of DA and improves the 10‐day forecast skill at all lead times. Plain Language Summary: Computer models used for operational weather prediction are not perfect; they are naturally only simplifications of the true atmosphere. Such imperfections result in reduced forecast quality. Weather forecast systems routinely correct the forecasts by pulling them closer to observations, thus providing some information about the errors present in the forecast model.
Here, a neural network (NN) is trained to correct the National Oceanic and Atmospheric Administration's operational weather forecast model, FV3‐GFS, by “learning” the relation between the forecasts and the estimated model errors. The learned NN correction is then fed back into the weather model to improve the quality of the best guess state of the atmosphere and the subsequent 10‐day forecasts. By analyzing how the NN output depends on its input forecast, we gain some insight about the model errors, which may be helpful for future atmospheric model development and improvements to future error‐correcting NNs. Key Points: A neural network (NN) trained to infer analysis increments from model forecasts learns to correct systematic errors in the FV3‐GFS model. Sensitivity analysis of the NN reveals physically consistent error characteristics that may be used to improve the NN architecture. Applying online corrections from the NN improves the accuracy of sequential data assimilation and extended free forecasts.
A GSI-Based Coupled EnSRF–En3DVar Hybrid Data Assimilation System for the Operational Rapid Refresh Model: Tests at a Reduced Resolution
A coupled ensemble square root filter–three-dimensional ensemble-variational hybrid (EnSRF–En3DVar) data assimilation (DA) system is developed for the operational Rapid Refresh (RAP) forecasting system. The En3DVar hybrid system employs the extended control variable method, and is built on the NCEP operational gridpoint statistical interpolation (GSI) three-dimensional variational data assimilation (3DVar) framework. It is coupled with an EnSRF system for RAP, which provides ensemble perturbations. Recursive filters (RF) are used to localize ensemble covariance in both the horizontal and vertical directions within the En3DVar. The coupled En3DVar hybrid system is evaluated with 3-h cycles over a 9-day period with active convection. All conventional observations used by operational RAP are included. The En3DVar hybrid system is run at ⅓ of the operational RAP horizontal resolution or about 40-km grid spacing, and its performance is compared to parallel GSI 3DVar and EnSRF runs using the same datasets and resolution. Short-term forecasts initialized from the 3-hourly analyses are verified against sounding and surface observations. When using equally weighted static and ensemble background error covariances and 40 ensemble members, the En3DVar hybrid system outperforms the corresponding GSI 3DVar and EnSRF. When the recursive filter coefficients are tuned to achieve a similar height-dependent localization as in the EnSRF, the En3DVar results using pure ensemble covariance are close to EnSRF. Two-way coupling between EnSRF and En3DVar did not produce noticeable improvement over one-way coupling. Downscaled precipitation forecast skill on the 13-km RAP grid from the En3DVar hybrid is better than that from GSI 3DVar analyses.
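The equally weighted blend of static and ensemble background-error covariances mentioned above can be sketched explicitly. This toy version forms the covariances as dense matrices and omits recursive-filter localization and the extended control variable machinery; names and weights are illustrative.

```python
import numpy as np

def hybrid_covariance(B_static, ens_pert, beta_static=0.5, beta_ens=0.5):
    """Blend a static background-error covariance with the sample
    ensemble covariance (no localization in this sketch).

    B_static: (n, n) static covariance.
    ens_pert: (n, n_members) ensemble perturbations about the mean.
    The betas weight the two terms; 0.5/0.5 is the equally weighted
    case evaluated in the experiments described above.
    """
    n_members = ens_pert.shape[1]
    P_ens = ens_pert @ ens_pert.T / (n_members - 1)  # sample covariance
    return beta_static * B_static + beta_ens * P_ens
```

In a real En3DVar the ensemble term is never formed explicitly; it enters through extended control variables, with localization applied by recursive filters.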
FEASIBILITY OF A 100-YEAR REANALYSIS USING ONLY SURFACE PRESSURE DATA
Climate variability and global change studies are increasingly focused on understanding and predicting regional changes of daily weather statistics. Assessing the evidence for such variations over the last 100 yr requires a daily tropospheric circulation dataset. The only dataset available for the early twentieth century consists of error-ridden hand-drawn analyses of the mean sea level pressure field over the Northern Hemisphere. Modern data assimilation systems have the potential to improve upon these maps, but prior to 1948, few digitized upper-air sounding observations are available for such a “reanalysis.” We investigate whether the newly recovered surface pressure observations are sufficient to generate useful weather maps of the lower-tropospheric extratropical circulation back to 1890 over the Northern Hemisphere, and back to 1930 over the Southern Hemisphere. Surprisingly, we find that by using an advanced data assimilation system based on an ensemble Kalman filter, it would be feasible to produce high-quality maps of even the upper troposphere using only surface pressure observations. For the beginning of the twentieth century, the errors of such upper-air circulation maps over the Northern Hemisphere in winter would be comparable to the 2–3-day errors of modern weather forecasts.
Predictions of 2010’s Tropical Cyclones Using the GFS and Ensemble-Based Data Assimilation Methods
Experimental ensemble predictions of tropical cyclone (TC) tracks from the ensemble Kalman filter (EnKF) using the Global Forecast System (GFS) model were recently validated for the 2009 Northern Hemisphere hurricane season by Hamill et al. A similar suite of tests is described here for the 2010 season. Two major changes were made this season: 1) a reduction in the resolution of the GFS model, from 2009’s T384L64 (~31 km at 25°N) to 2010’s T254L64 (~47 km at 25°N), and some changes in model physics; and 2) the addition of a limited test of deterministic forecasts initialized from a hybrid three-dimensional variational data assimilation (3D-Var)/EnKF method. The GFS/EnKF ensembles continued to produce reduced track errors relative to operational ensemble forecasts created by the National Centers for Environmental Prediction (NCEP), the Met Office (UKMO), and the Canadian Meteorological Centre (CMC). The GFS/EnKF was not uniformly as skillful as the European Centre for Medium-Range Weather Forecasts (ECMWF) ensemble prediction system. GFS/EnKF track forecasts had slightly higher error than ECMWF at longer leads, especially in the western North Pacific, and exhibited poorer calibration between spread and error than in 2009, perhaps in part because of lower model resolution. Deterministic forecasts from the hybrid were competitive with deterministic EnKF ensemble-mean forecasts and superior in track error to those initialized from the operational variational algorithm, the Gridpoint Statistical Interpolation (GSI). Pending further successful testing, the National Oceanic and Atmospheric Administration (NOAA) intends to implement the global hybrid system operationally for data assimilation.
Sensitivities of the NCEP Global Forecast System
An important issue in developing a forecast system is its sensitivity to additional observations for improving initial conditions, to the data assimilation (DA) method used, and to improvements in the forecast model. These sensitivities are investigated here for the Global Forecast System (GFS) of the National Centers for Environmental Prediction (NCEP). Four parallel sets of 7-day ensemble forecasts were generated for 100 forecast cases in mid-January to mid-March 2016. The sets differed in their 1) inclusion or exclusion of additional observations collected over the eastern Pacific during the El Niño Rapid Response (ENRR) field campaign, 2) use of a hybrid 4D–EnVar versus a pure EnKF DA method to prepare the initial conditions, and 3) inclusion or exclusion of stochastic parameterizations in the forecast model. The Control forecast set used the ENRR observations, hybrid DA, and stochastic parameterizations. Errors of the ensemble-mean forecasts in this Control set were compared with those in the other sets, with emphasis on the upper-tropospheric geopotential heights and vorticity, midtropospheric vertical velocity, column-integrated precipitable water, near-surface air temperature, and surface precipitation. In general, the forecast errors were found to be only slightly sensitive to the additional ENRR observations, more sensitive to the DA methods, and most sensitive to the inclusion of stochastic parameterizations in the model, which reduced errors globally in all the variables considered except geopotential heights in the tropical upper troposphere. The reduction in precipitation errors, determined with respect to two independent observational datasets, was particularly striking.