Catalogue Search | MBRL
Explore the vast range of titles available.
55 result(s) for "Jedidi, Kamel"
A megastudy on the predictability of personal information from facial images: Disentangling demographic and non-demographic signals
2023
While prior research has shown that facial images signal personal information, publications in this field tend to assess the predictability of a single variable or a small set of variables at a time, which is problematic. Reported prediction quality is hard to compare and generalize across studies due to different study conditions. Another issue is selection bias: researchers may choose to study variables intuitively expected to be predictable and underreport unpredictable variables (the ‘file drawer’ problem). Policy makers thus have an incomplete picture for a risk-benefit analysis of facial analysis technology. To address these limitations, we perform a megastudy—a survey-based study that reports the predictability of numerous personal attributes (349 binary variables) from 2646 distinct facial images of 969 individuals. Using deep learning, we find 82/349 personal attributes (23%) are predictable better than random from facial image pixels. Adding facial images substantially boosts prediction quality versus a demographics-only benchmark model. Our unexpected finding of strong predictability of the iPhone-versus-Galaxy preference variable shows how testing many hypotheses simultaneously can facilitate knowledge discovery. Our proposed L1-regularized image decomposition method and other techniques point to smartphone camera artifacts, BMI, skin properties, and facial hair as top candidate non-demographic signals in facial images.
Journal Article
R2M Index 1.0
2021
Using text-mining, the authors develop version 1.0 of the Relevance to Marketing (R2M) Index, a dynamic index that measures the topical and timely relevance of academic marketing articles to marketing practice. The index assesses topical relevance drawing on a dictionary of marketing terms derived from 50,000 marketing articles published in practitioner outlets from 1982 to 2019. Timely relevance is based on the prevalence of academic marketing topics in practitioner publications at a given time. The authors classify topics into four quadrants based on their low/high popularity in academia and practice—"Desert," "Academic Island," "Executive Fields," and "Highlands"—and score academic articles and journals: Journal of Marketing has the highest R2M score, followed by Marketing Science, Journal of Marketing Research, and Journal of Consumer Research. The index correlates with practitioner judgments of practical relevance and other relevance measures. Because the index is a work in progress, the authors discuss how to overcome current limitations and suggest correlating the index with citation counts, altmetrics, and readability measures. Marketing practitioners, authors, and journal editors can use the index to assess article relevance, and academic administrators can use it for promotion and tenure decisions (see www.R2Mindex.com). The R2M Index is thus not only a measurement instrument but also a tool for change.
Journal Article
The past, present, and future of measurement and methods in marketing analysis
by Hanssens, Dominique M.; Lehmann, Donald R.; DeSarbo, Wayne S.
in Brand equity; Brand loyalty; Brands
2020
The field of marketing has made significant strides over the past 50 years in understanding how methodological choices affect the validity of conclusions drawn from our research. This paper highlights some of these strides and is organized as follows: We first summarize essential concepts about measurement and the role of cumulating knowledge, then highlight data and analysis methods in terms of their past, present, and future. Lastly, we provide specific examples of the evolution of work on segmentation and brand equity. With relatively well-established methods for measuring constructs, analysis methods have evolved substantially. There have been significant changes in what is seen as the best way to analyze individual studies as well as accumulate knowledge across them via meta-analysis. Collaborations between academia and business can move marketing research forward. These will require tradeoffs between model prediction and interpretation, and a balance between large-scale use of data and privacy concerns.
Journal Article
A Conjoint Approach for Consumer- and Firm-Level Brand Valuation
2009
This article develops and tests a reduced-form, conjoint methodology for measuring brand equity. The proposed approach (1) provides objective dollar-metric values for brand equity without the need to collect perceptual or brand association data, (2) captures the effects of awareness and availability in the marketplace as sources of brand equity, (3) accounts for competitive reaction, (4) allows the mix of branded and unbranded firms to affect industry size, and (5) uses consideration set theory to project market share estimates from the conjoint experiment to the marketplace. Managers can use the approach to develop customized strategies for targeting customers, monitoring brand "health," allocating resources, and determining the values of brands in a merger or acquisition. The empirical results suggest that the proposed metric for measuring consumer-level brand equity has convergent validity; in addition, the magnitudes and strengths of brand equity vary considerably across consumers and brands. At the firm level, the results show that previous methods are likely to overstate brand equity, especially for products with low market shares. Finally, the results show that the external validity for the proposed brand equity measures is high.
Journal Article
Customer value analysis in a heterogeneous market
by Sinha, Indrajit; DeSarbo, Wayne S.; Jedidi, Kamel
in Brand loyalty; Business structures; Buyers
2001
In recent years, customer value has become a major focus among strategy researchers and practitioners as an essential element of a firm's competitive strategy. Many firms have been interested in Customer Value Analysis (CVA) which involves a structural analysis of the antecedent factors of perceived value (i.e., perceived quality and perceived price) to assess their relative importance in the perceptions of their buyers. We develop a statistical approach for performing CVA utilizing a recursive simultaneous equation model that is formulated to accommodate buyer heterogeneity. In particular, the proposed finite-mixture methodology allows one to estimate the relative effects and integration rules of perceived value drivers at the market segment level, as well as to simultaneously determine the (unknown) segments themselves. We demonstrate the utility of the proposed methodology via an actual commercial application involving a large electric utility company. Finally, we discuss the contributions of our research from the perspective of firm strategy and how it may be extended in the future.
Journal Article
Augmenting Conjoint Analysis to Estimate Consumer Reservation Price
2002
Consumer reservation price is a key concept in marketing and economics. Theoretically, this concept has been instrumental in studying consumer purchase decisions, competitive pricing strategies, and welfare economics. Managerially, knowledge of consumer reservation prices is critical for implementing many pricing tactics such as bundling, target promotions, nonlinear pricing, and one-to-one pricing, and for assessing the impact of marketing strategy on demand. Despite the practical and theoretical importance of this concept, its measurement at the individual level in a practical setting proves elusive.
We propose a conjoint-based approach to estimate consumer-level reservation prices. This approach integrates the preference estimation of traditional conjoint with the economic theory of consumer choice. This integration augments the capability of traditional conjoint such that consumers' reservation prices for a product can be derived directly from the individual-level estimates of conjoint coefficients. With this augmentation, we can model a consumer's decision of not only which product to buy, but also whether to buy at all in a category. Thus, we can simulate simultaneously three effects that a change in price or the introduction of a new product may generate in a market: the customer switching effect, the cannibalization effect, and the market expansion effect. We show in a pilot application how this approach can aid product and pricing decisions. We also demonstrate the predictive validity of our approach using data from a commercial study of automobile batteries.
Journal Article
Measuring Heterogeneous Reservation Prices for Product Bundles
2003
This paper develops a model for capturing continuous heterogeneity in the joint distribution of reservation prices for products and bundles. Our model is derived from utility theory and captures both within- and among-subject variability. Furthermore, it provides dollar-metric reservation prices and individual-level estimates that allow the firm to target customers and develop customized and nonlinear pricing policies.
Our experiments show that, regardless of whether the products are durables or nondurables, the model captures heterogeneity and predicts well. Models that assume homogeneity perform poorly, especially in predicting choice of the bundle. Furthermore, the methodology is robust even when respondents evaluate few profiles.
Self-stated reservation prices do not have any informational content beyond that contained in the basic model. The direct elicitation method appears to understate (overstate) the variation in reservation prices across consumers for low-priced (high-priced) products and bundles. Hence this method yields biased demand estimates and leads to suboptimal product-line pricing policy.
The optimization results show that the product-line pricing policy depends on the degree of heterogeneity in the reservation prices of the individual products and the bundle. A uniformly high-price strategy for all products and bundles is optimal when heterogeneity is high. Otherwise, a hybrid strategy is optimal.
Journal Article
Relation Between EBA and Nested Logit Models
2017
We show that elimination by aspects (EBA) generalizes nested logit and cross-nested logit models. The latter two models are equivalent to a special case of EBA called preference trees. The transformations between preference trees and nested logit models become more complex when the utilities of alternatives are functions of covariates. In this case, a simple model in one domain corresponds to a complex model in the other. An extended EBA model, in which the utilities of alternatives are functions of covariates, represents a two-stage choice process. Alternatives are first screened using a probabilistic lexicographic rule and then compared in terms of their compensatory utilities. We provide a typology of the relations between EBA and other logit models, and we discuss issues concerning estimation, statistical testing, and data collection. We describe an application illustrating (1) the process of constructing a preference tree with covariates and (2) the different implications obtained from a preference tree and a comparable nested logit model.
Journal Article
Representation and Inference of Lexicographic Preference Models and Their Variants
2007
The authors propose two variants of lexicographic preference rules. They obtain the necessary and sufficient conditions under which a linear utility function represents a standard lexicographic rule, and each of the proposed variants, over a set of discrete attributes. They then: (i) characterize the measurement properties of the parameters in the representations; (ii) propose a nonmetric procedure for inferring each lexicographic rule from pairwise comparisons of multiattribute alternatives; (iii) describe a method for distinguishing among different lexicographic rules, and between lexicographic and linear preference models; and (iv) suggest how individual lexicographic rules can be combined to describe hierarchical market structures. The authors illustrate each of these aspects using data on personal-computer preferences. They find that two-thirds of the subjects in the sample use some kind of lexicographic rule. In contrast, only one in five subjects uses a standard lexicographic rule. This suggests that lexicographic rules are more widely used by consumers than one might have thought in the absence of the lexicographic variants described in the paper. The authors report a simulation assessing the ability of the proposed inference procedure to distinguish among alternative lexicographic models, and between linear-compensatory and lexicographic models.
Journal Article
Dynamic Allocation of Pharmaceutical Detailing and Sampling for Long-Term Profitability
by Montoya, Ricardo; Netzer, Oded; Jedidi, Kamel
in Accounting; Advertising expenditures; Analysis
2010
The U.S. pharmaceutical industry spent upwards of $18 billion on marketing drugs in 2005; detailing and drug sampling activities accounted for the bulk of this spending. To stay competitive, pharmaceutical managers need to maximize the return on these marketing investments by determining which physicians to target as well as when and how to target them.
In this paper, we present a two-stage approach for dynamically allocating detailing and sampling activities across physicians to maximize long-run profitability. In the first stage, we estimate a hierarchical Bayesian, nonhomogeneous hidden Markov model to assess the short- and long-term effects of pharmaceutical marketing activities. The model captures physicians' heterogeneity and dynamics in prescription behavior. In the second stage, we formulate a partially observable Markov decision process that integrates over the posterior distribution of the hidden Markov model parameters to derive a dynamic marketing resource allocation policy across physicians.
We apply the proposed approach in the context of a new drug introduction by a major pharmaceutical firm. We identify three prescription-behavior states, a high degree of physicians' dynamics, and substantial long-term effects for detailing and sampling. We find that detailing is most effective as an acquisition tool, whereas sampling is most effective as a retention tool. The optimization results suggest that the firm could increase its profits substantially while decreasing its marketing spending. Our suggested framework provides important implications for dynamically managing customers and maximizing long-run profitability.
Journal Article