Catalogue Search | MBRL
200 result(s) for "hierarchical priors"
Direction-of-Arrival Estimation via Sparse Bayesian Learning Exploiting Hierarchical Priors with Low Complexity
2024
For direction-of-arrival (DOA) estimation problems in a sparse domain, sparse Bayesian learning (SBL) is highly favored by researchers owing to its excellent estimation performance. However, traditional SBL-based methods typically assign Gaussian priors to the parameters to be solved, leading to only moderate sparse signal recovery (SSR), because Gaussian priors play a role similar to l2 regularization as a sparsity constraint. Numerous methods have therefore been developed that adopt hierarchical priors, which perform better than Gaussian priors. However, these methods struggle when multiple-measurement-vector (MMV) data are used. On this basis, a block-sparse SBL method (named BSBL) is developed to handle DOA estimation problems in MMV models. The novelty of BSBL is the combination of hierarchical priors with the block-sparse model arising from MMV data. On the one hand, BSBL transforms the MMV model into a block-sparse model by vectorization so that Bayesian learning can be performed directly, avoiding the prior independence assumption across measurement vectors and the inconvenience of matrix-form solutions. On the other hand, BSBL inherits the superior SSR ability of hierarchical priors. Despite these benefits, BSBL still suffers from relatively high computational complexity caused by high-dimensional matrix operations. In view of this, two operations are implemented for low complexity: one reduces the matrix dimension of BSBL by approximation, yielding a method named BSBL-APPR; the other embeds the generalized approximate message passing (GAMP) technique into BSBL to decompose matrix operations into vector or scalar operations, yielding BSBL-GAMP. Moreover, BSBL is able to suppress temporal correlation and handle wideband sources easily. Extensive simulation results are presented to demonstrate the superiority of BSBL over other state-of-the-art algorithms.
Journal Article
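At its core, the SBL approach described in this abstract places a Gaussian prior on each coefficient whose variance itself gets a Gamma hyperprior, updated by EM. A minimal single-measurement-vector sketch of that loop (not the paper's block-sparse BSBL; the `sbl` function, problem sizes, and hyperparameter values are illustrative assumptions):

```python
import numpy as np

def sbl(A, y, noise_var=1e-2, a=1e-6, b=1e-6, n_iter=100):
    """Sparse Bayesian learning with Gamma hyperprior terms (a, b) on the
    per-coefficient prior variances; a = b = 0 recovers the classic update."""
    m, n = A.shape
    gamma = np.ones(n)  # current prior variances of x
    for _ in range(n_iter):
        # E-step: Gaussian posterior of x given the current variances
        Sigma = np.linalg.inv(A.T @ A / noise_var + np.diag(1.0 / gamma))
        mu = Sigma @ A.T @ y / noise_var
        # M-step: variance update; the hyperprior terms floor small gammas
        gamma = (mu**2 + np.diag(Sigma) + 2 * b) / (1 + 2 * a)
    return mu, gamma

# Hypothetical example: recover a 3-sparse vector from 40 noisy measurements
rng = np.random.default_rng(0)
A = rng.standard_normal((40, 80))
x_true = np.zeros(80)
x_true[[5, 20, 60]] = [2.0, -1.5, 1.0]
y = A @ x_true + 0.05 * rng.standard_normal(40)
mu, gamma = sbl(A, y, noise_var=0.05**2)
```

Coefficients whose `gamma` collapses toward zero are pruned out of the support, which is how SBL expresses sparsity without an explicit l1 penalty.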
HBMIRT: A SAS macro for estimating uni- and multidimensional 1- and 2-parameter item response models in small (and large!) samples
by Zitzmann, Steffen; Hecht, Martin; Wagner, Wolfgang
in Bayes Theorem; Behavioral Science and Psychology; Cognitive Psychology
2024
Item response theory (IRT) has evolved as a standard psychometric approach in recent years, in particular for test construction based on dichotomous (i.e., true/false) items. Unfortunately, large samples are typically needed for item refinement in unidimensional models and even more so in the multidimensional case. However, Bayesian IRT approaches with hierarchical priors have recently been shown to be promising for estimating even complex models in small samples. Still, it may be challenging for applied researchers to set up such IRT models in general purpose or specialized statistical computer programs. Therefore, we developed a user-friendly tool – a SAS macro called HBMIRT – that allows users to estimate uni- and multidimensional IRT models with dichotomous items. We explain the capabilities and features of the macro and demonstrate the particular advantages of the implemented hierarchical priors in rather small samples over weakly informative priors and traditional maximum likelihood estimation with the help of a simulation study. The macro can also be used with the online version of SAS OnDemand for Academics that is freely accessible for academic researchers.
Journal Article
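The 2-parameter (2PL) model the macro estimates has a simple likelihood; a minimal sketch in Python of the response probability and the resulting log-likelihood, with the hierarchical-prior idea noted in a comment (the function names and the normal prior on log-discriminations are illustrative assumptions, not HBMIRT's actual implementation):

```python
import numpy as np

def p_correct(theta, a, b):
    """2PL probability of a correct response:
    P(y=1) = 1 / (1 + exp(-a * (theta - b)))
    theta: person ability, a: item discrimination, b: item difficulty."""
    return 1.0 / (1.0 + np.exp(-a * (theta - b)))

def log_lik(responses, theta, a, b):
    """Log-likelihood of a persons-by-items 0/1 response matrix.
    A hierarchical-prior version would add e.g. log a_j ~ N(mu, sigma^2),
    pooling item discriminations toward a common mean -- which is what
    stabilizes estimation in small samples."""
    p = p_correct(theta[:, None], a[None, :], b[None, :])
    return np.sum(responses * np.log(p) + (1 - responses) * np.log(1 - p))
```

With `theta = b` the response probability is exactly 0.5, and steeper discriminations `a` make the item separate abilities more sharply around `b`.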
Efficient Sparse Bayesian Learning Model for Image Reconstruction Based on Laplacian Hierarchical Priors and GAMP
2024
In this paper, we present a novel sparse Bayesian learning (SBL) method for image reconstruction. We integrate the generalized approximate message passing (GAMP) algorithm and Laplacian hierarchical priors (LHP) into a basic SBL model (called LHP-GAMP-SBL) to improve the reconstruction efficiency. In our SBL model, the GAMP structure is used to estimate the mean and variance without matrix inversion in the E-step, while LHP is used to update the hyperparameters in the M-step. The combination of these two structures deepens the hierarchical structure of the model and enhances its representational ability, so that reconstruction accuracy can be improved. Moreover, the introduction of LHP accelerates the convergence of GAMP, which shortens the reconstruction time of the model. Experimental results verify the effectiveness of our method.
Journal Article
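A Laplace prior of the kind used in LHP can be written as a Gaussian scale mixture, and its proximal operator is the familiar soft-threshold rule, which is the scalar denoiser a GAMP-style solver applies elementwise in place of any matrix inversion. A minimal sketch (the function name is an assumption, not the paper's code):

```python
import numpy as np

def soft_threshold(v, t):
    """Proximal operator of the Laplace (l1) prior with threshold t:
    shrinks v toward zero by t, exactly zeroing anything with |v| <= t.
    GAMP applies this scalar rule elementwise to its current estimate."""
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)
```

Because the operation is purely elementwise, each GAMP iteration costs only matrix-vector products, which is the source of the efficiency gain the abstract describes.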
Mixture autoregressive and spectral attention network for multispectral image compression based on variational autoencoder
2024
Multispectral images, with their unique three-dimensional characteristics, require specialized spatial-spectral feature extraction modules to achieve superior compression results. Current end-to-end compression frameworks underperform compared to advanced coding algorithms, primarily due to insufficient spectral feature extraction at high bit rates and challenges in guiding entropy coding. To address these issues, this paper proposes the Mixture Autoregressive Spectral Attention Network (MARSA-Net), featuring two attention mechanisms: Coor-Spec and LD-CAM, and an autoregressive component. Our evaluation on real datasets from satellites demonstrates MARSA-Net’s superiority over traditional algorithms, including H.266/VVC, underlining its potential in multispectral image compression. This research contributes to improved compression methods and extends our understanding of spectral feature extraction in multispectral imagery.
Journal Article
South African inflation response to fiscal policy shocks
by Buthelezi, Eugene Msizi
in Bayesian analysis; Bayesian vector autoregressions; Central government
2024
The research employs Bayesian Vector Autoregressions with hierarchical priors to analyze the intricate economic implications of fiscal policy shocks on inflation, monetary policy, and fiscal authorities in the context of South Africa. The study explores data spanning from 1979 to 2022. Contrary to conventional economic theories, our analysis demonstrates that unexpected increases in national government expenditure led to counterintuitive initial decreases in inflation. This highlights the complexity of inflation dynamics and challenges existing paradigms. Moreover, the lagged response of inflation to changes in government revenue emphasizes the role of inflation expectations and market dynamics. Clear communication by fiscal authorities is crucial for shaping these expectations and understanding their impact on inflation.
Journal Article
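BVARs with hierarchical priors of the kind used in this study are conjugate at their core: the posterior mean of the coefficients is a precision-weighted blend of the OLS fit and a shrinkage target. A minimal VAR(1) sketch under a Minnesota-style prior that shrinks each equation toward a random walk, assuming unit noise variance for simplicity (the tightness `lam` is exactly the hyperparameter that hierarchical treatments place a prior on; the function and all values here are illustrative assumptions, not the paper's specification):

```python
import numpy as np

def bvar_posterior_mean(Y, lam=0.2):
    """Posterior mean of VAR(1) coefficients B in y_t = B' y_{t-1} + e_t
    under a conjugate prior B ~ N(I, lam^2) per element: the prior mean I
    encodes a random-walk benchmark, and smaller lam shrinks harder."""
    X, y = Y[:-1], Y[1:]
    n = X.shape[1]
    prior_mean = np.eye(n)          # shrink each equation toward a random walk
    P = np.eye(n) / lam**2          # prior precision
    return np.linalg.solve(X.T @ X + P, X.T @ y + P @ prior_mean)

# Hypothetical example: 3-variable system, 100 observations
rng = np.random.default_rng(1)
Y = rng.standard_normal((100, 3))
B = bvar_posterior_mean(Y, lam=0.2)
```

In a fully hierarchical setup the tightness is not fixed but estimated from the data, letting the model itself decide how informative the random-walk prior should be.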
Do non-performing loans matter for bank lending and the business cycle in Euro area countries?
by Pancaro, Cosimo; Martin, Reiner; Huljak, Ivan
in Banking; Banking industry; Bayesian analysis
2022
We estimate the impact of changes in non-performing loan (NPL) ratios on aggregate banking sector variables and the macroeconomy by estimating a panel Bayesian VAR model for twelve euro area countries. The main findings are as follows: i) An impulse response analysis shows that an exogenous increase in the change in NPL ratios tends to depress bank lending volumes, widens bank lending spreads and leads to a fall in real GDP growth and residential real estate prices; ii) A forecast error variance decomposition shows that shocks to the change in NPL ratios explain a relatively large share of the variance of the variables in the VAR, particularly for countries that experienced a large increase in NPL ratios during the recent crises; and iii) A three-year structural out-of-sample scenario analysis suggests that reducing banks' NPL ratios can produce significant benefits in terms of improved macroeconomic and financial conditions.
Journal Article
Shrinkage priors for high-dimensional demand estimation
2023
Estimating demand for large assortments of differentiated goods requires the specification of a demand system that is sufficiently flexible. However, flexible models are highly parameterized so estimation requires appropriate forms of regularization to avoid overfitting. In this paper, we study the specification of Bayesian shrinkage priors for pairwise product substitution parameters. We use a log-linear demand system as a leading example. Log-linear models are parameterized by own and cross-price elasticities, and the total number of elasticities grows quadratically in the number of goods. Traditional regularized estimators shrink regression coefficients towards zero which can be at odds with many economic properties of price effects. We propose a hierarchical extension of the class of global-local priors commonly used in regression modeling to allow the direction and rate of shrinkage to depend on a product classification tree. We use both simulated data and retail scanner data to show that, in the absence of a strong signal in the data, estimates of price elasticities and demand predictions can be improved by imposing shrinkage to higher-level group elasticities rather than zero.
Journal Article
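The key idea in this abstract is shrinking each elasticity toward its group-level counterpart rather than toward zero. In a conjugate normal model that reduces to a convex combination of the raw estimate and the group mean; a minimal scalar sketch (the function name and the fixed scale are illustrative assumptions, not the paper's full global-local prior):

```python
import numpy as np

def shrink_to_group(beta_hat, group_mean, s2, tau2_lam2):
    """Posterior mean of beta_i under beta_i ~ N(group_mean, tau2_lam2)
    and beta_hat | beta_i ~ N(beta_i, s2): a precision-weighted average.
    kappa is the shrinkage weight; global-local priors make tau2_lam2
    itself random (e.g. half-Cauchy), so shrinkage adapts per coefficient."""
    kappa = s2 / (s2 + tau2_lam2)
    return kappa * group_mean + (1 - kappa) * beta_hat
```

With a very tight local scale the estimate collapses onto the group-level elasticity, and with a diffuse one it stays at the raw estimate, which is precisely the adaptive behavior the product classification tree is meant to steer.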
A scrutiny into fiscal policy in the South African economy: A Bayesian approach with hierarchical priors
by Greyling, Lorraine; Zungu, Lindokuhle Talent; Makhoba, Bongumusa P
in Bayesian analysis; BVAR; Capital formation
2022
This study analyses the impact of fiscal policy on the South African economy during the period 1972Q1-2020Q2. The study adopted quarterly time series data to estimate a Bayesian Vector Autoregression (BVAR) model with the selection of hierarchical priors. The variables employed for empirical investigation included GDP, government expenditure, public debt, and gross fixed-capital formation. The results of the study show that an unexpected shock in government expenditure and public debt has a significant negative and persistent impact on economic growth in South Africa, while an unexpected shock in investment has a significant positive effect on economic growth. The findings suggest that escalating public expenditure and public debt lead to economic contraction. This implies that policy-makers ought to be cautious of excessive government expenditure and public debt to achieve fiscal consolidation. Policy-makers ought to focus on addressing structural challenges through the implementation of sound structural reform policies that aim to attract investment consistent with job creation, development and growth in South Africa's economy.
Journal Article
The effects of economic policy uncertainty on European economies: evidence from a TVP-FAVAR
2020
We use a time-varying parameter FAVAR model to investigate the effects of economic policy uncertainty (EPU) on a wide range of macroeconomic variables for eleven European Monetary Union (EMU) countries. First, we are able to distinguish between a group of fragile countries (GIIPS countries) and a group of stable countries (northern countries), where the former suffered the most due to EPU shocks. Second, we find that EPU shocks affect financial markets as well as the real economy and that private investors and financial market participants react more sensitively than consumers to EPU shocks. Third, we discover that the transmission of EPU shocks is quite stable over time.
Journal Article
How to generalize from a hierarchical model?
2020
Models of consumer heterogeneity play a pivotal role in marketing and economics, specifically in random coefficient or mixed logit models for aggregate or individual data and in hierarchical Bayesian models of heterogeneity. In applications, the inferential target often pertains to a population beyond the sample of consumers providing the data. For example, optimal prices inferred from the model are expected to be optimal in the population and not just optimal in the observed, finite sample. The population model, random coefficients distribution, or heterogeneity distribution is the natural and correct basis for generalizations from the observed sample to the market. However, in many if not most applications standard heterogeneity models such as the multivariate normal, or its finite mixture generalization, lack economic rationality because they support regions of the parameter space that contradict basic economic arguments. For example, such population distributions support positive price coefficients or preferences against fuel-efficiency in cars. Likely as a consequence, it is common practice in applied research to rely on the collection of individual-level mean estimates of consumers as a representation of population preferences, which often substantially reduces the support for parameters in violation of economic expectations. To overcome the choice between relying on a mis-specified heterogeneity distribution and the collection of individual-level means that fail to measure heterogeneity consistently, we develop an approach that facilitates the formulation of more economically faithful heterogeneity distributions based on prior constraints. In the common situation where the heterogeneity distribution comprises both constrained and unconstrained coefficients (e.g., brand and price coefficients), the choice of subjective prior parameters is an unresolved challenge. As a solution to this problem, we propose a marginal-conditional decomposition that avoids the conflict between wanting to be more informative about constrained parameters and only weakly informative about unconstrained parameters. We show how to efficiently sample from the implied posterior and illustrate the merits of our prior as well as the drawbacks of relying on means of individual-level preferences for decision-making in two illustrative case studies.
Journal Article