Catalogue Search | MBRL
Explore the vast range of titles available.
129 result(s) for "Ellison, Glenn"
An Economist's Guide to Epidemiology Models of Infectious Disease
We describe the structure and use of epidemiology models of disease transmission, with an emphasis on the susceptible/infected/recovered (SIR) model. We discuss high-profile forecasts of cases and deaths that have been based on these models, what went wrong with the early forecasts, and how they have adapted to the current COVID pandemic. We also offer three distinct areas where economists would be well positioned to contribute to or inform this epidemiology literature: modeling heterogeneity of susceptible populations in various dimensions, accommodating endogeneity of the parameters governing disease spread, and helping to understand the importance of political economy issues in disease suppression.
Journal Article
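For reference, the susceptible/infected/recovered model discussed in the abstract above is usually written as the following system of differential equations; the notation here, with transmission rate \(\beta\), recovery rate \(\gamma\), and population size \(N\), is the textbook convention rather than necessarily the paper's:
\[
\frac{dS}{dt} = -\beta \frac{SI}{N}, \qquad \frac{dI}{dt} = \beta \frac{SI}{N} - \gamma I, \qquad \frac{dR}{dt} = \gamma I,
\]
with \(N = S + I + R\) held fixed and basic reproduction number \(R_0 = \beta/\gamma\); an outbreak grows as long as the effective reproduction number \(R_0 \, S/N\) exceeds one.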
A search cost model of obfuscation
2012
This article develops models in which obfuscation is individually rational for oligopolistic firms. Firms sell a homogeneous good to rational consumers who incur search costs to learn prices. Search costs are endogenized by allowing obfuscation—firms have an unobservable action that increases the time needed to learn their price. One model involves search costs convex in shopping time. We show that slight convexity can dramatically alter the equilibrium price distribution. A second model examines an informational linkage between current and future search costs: consumers are uncertain about a component of search costs. Here, a signal-jamming mechanism can lead to equilibrium obfuscation.
Journal Article
The Efficiency of Race-Neutral Alternatives to Race-Based Affirmative Action
2021
Several K-12 and university systems have adopted race-neutral affirmative action in place of race-based alternatives. This paper explores whether these plans are effective substitutes for racial quotas in Chicago Public Schools (CPS), which now employs a race-neutral, place-based affirmative action system at its selective exam high schools. The CPS plan is ineffective compared to plans that explicitly consider race: about three-quarters of the reduction in average entrance scores at the top schools could have been avoided with the same level of racial diversity. Moreover, the CPS plan is less effective at adding low-income students than was the previous system of racial quotas. We develop a theoretical framework that motivates quantifying the inefficiency of race-neutral policies based on the distortion in student preparedness they create for a given level of diversity and use it to evaluate several alternatives. The CPS plan can be improved in several ways, but no race-neutral policy restores minority representation to prior levels without substantially greater distortions, implying significant efficiency costs from prohibitions on the explicit use of race.
Journal Article
What Causes Industry Agglomeration? Evidence from Coagglomeration Patterns
by Glaeser, Edward L.; Kerr, William R.; Ellison, Glenn
in 1972-1997, Agglomeration, Approximation
2010
Why do firms cluster near one another? We test Marshall's theories of industrial agglomeration by examining which industries locate near one another, or coagglomerate. We construct pairwise coagglomeration indices for US manufacturing industries from the Economic Census. We then relate coagglomeration levels to the degree to which industry pairs share goods, labor, or ideas. To reduce reverse causality, where collocation drives input-output linkages or hiring patterns, we use data from UK industries and from US areas where the two industries are not collocated. All three of Marshall's theories of agglomeration are supported, with input-output linkages particularly important.
Journal Article
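As background for the abstract above, a widely used form of the pairwise coagglomeration index in this literature compares where two industries locate relative to aggregate activity (stated here as an assumption about the construction; the paper's exact index may differ):
\[
\gamma^{c}_{ij} = \frac{\sum_{m} (s_{mi} - x_m)(s_{mj} - x_m)}{1 - \sum_{m} x_m^{2}},
\]
where \(s_{mi}\) is industry \(i\)'s share of employment located in region \(m\) and \(x_m\) is region \(m\)'s share of aggregate employment; positive values indicate that the two industries concentrate in the same regions more than aggregate activity alone would predict.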
The gender gap in secondary school mathematics at high achievement levels
2010
This paper uses a new data source, American Mathematics Competitions, to examine the gender gap among high school students at very high achievement levels. The data bring out several new facts. There is a large gender gap that widens dramatically at percentiles above those that can be examined using standard data sources. An analysis of unobserved heterogeneity indicates that there is only moderate variation in the gender gap across schools. The highest achieving girls in the U.S. are concentrated in a very small set of elite schools, suggesting that almost all girls with the ability to reach high math achievement levels are not doing so.
Journal Article
The Slowdown of the Economics Publishing Process
2002
Over the last three decades there has been a dramatic slowdown of the publication process at top economics journals. A substantial part is due to journals' requiring more extensive revisions. Various explanations are considered: democratization of the review process, increases in the complexity of papers, growth of the profession, and cost and benefit arguments. Changes in the profession are examined using time‐series data. Connections between these changes and the slowdown are examined using paper‐level data. There is evidence for some explanations, but most of the slowdown remains unexplained. Changes may reflect evolving social norms.
Journal Article
Strategic Entry Deterrence and the Behavior of Pharmaceutical Incumbents Prior to Patent Expiration
2011
This paper develops a new approach to testing for strategic entry deterrence and applies it to the behavior of pharmaceutical incumbents before patent expiration. It examines a cross section of markets, determining whether behavior is nonmonotonic in market size. Under some conditions, investment levels will be monotone in market size if firms do not invest to deter entry. Strategic investments to deter entry, however, may result in nonmonotonic investment because they are unnecessary in small markets, and impossible in large ones. Consistent with an entry-deterrence motivation is the finding that incumbents in medium-sized markets advertise less prior to patent expiration.
Journal Article
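The nonmonotonicity idea in the abstract above can be illustrated with a purely hypothetical reduced-form specification (an illustration of the logic, not the paper's actual test): regress an incumbent's investment, say advertising \(A_m\), on market size,
\[
A_m = \beta_0 + \beta_1 \log(\mathrm{size}_m) + \beta_2 \left(\log \mathrm{size}_m\right)^{2} + \varepsilon_m ,
\]
and read an inverted-U pattern (\(\beta_2 < 0\) with an interior peak) as the signature of deterrence-motivated investment that is unnecessary in small markets and impossible in large ones.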
Evolving Standards for Academic Publishing: A q-r Theory
2002
This paper develops models of quality standards to examine two trends: academic journals increasingly require extensive revisions of submissions, and articles are becoming longer and changing in other ways. Papers are modeled as varying along two quality dimensions: q reflects the importance of the main ideas and r other aspects of quality. Observed trends are regarded as increases in r-quality. A static equilibrium model illustrates comparative statics explanations. A dynamic model in which referees (with a biased view of their own work) learn social norms for weighting q and r is shown to produce a long, gradual evolution of social norms.
Journal Article
Implications of heterogeneous SIR models for analyses of COVID-19
2024
This paper provides a quick survey of results on the classic SIR model and variants allowing for heterogeneity in contact rates. It notes that calibrating the classic model to data generated by a heterogeneous model can lead to forecasts that are biased in several ways and to understatement of the forecast uncertainty. Among the biases are that we may underestimate how quickly herd immunity might be reached, underestimate differences across regions, and have biased estimates of the impact of endogenous and policy-driven social distancing.
Journal Article
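The herd-immunity point in the abstract above has a simple benchmark in the homogeneous SIR model: with basic reproduction number \(R_0\), the epidemic turns over once the susceptible share falls to \(1/R_0\), i.e. after a cumulative infected fraction of
\[
1 - \frac{1}{R_0}.
\]
When contact rates are heterogeneous, the most active individuals tend to be infected earliest, so the effective reproduction number falls faster and herd immunity can be reached at a lower cumulative infection level than this benchmark suggests. The formula above is the standard textbook threshold, not a result specific to this paper.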
Basins of Attraction, Long-Run Stochastic Stability, and the Speed of Step-by-Step Evolution
2000
The paper examines the behaviour of “evolutionary” models with ɛ-noise like those which have been used recently to discuss the evolution of social conventions. The paper is built around two main observations: that the “long run stochastic stability” of a convention is related to the speed with which evolution toward and away from the convention occurs, and that evolution is more rapid (and hence more powerful) when it may proceed via a series of small steps between intermediate steady states. The formal analysis uses two new measures, the radius and modified coradius, to characterize the long run stochastically stable set of an evolutionary model and to bound the speed with which evolutionary change occurs. Though not universally powerful, the result can be used to make many previous analyses more transparent and extends them by providing results on waiting times. A number of applications are also discussed. The selection of the risk dominant equilibrium in 2 × 2 games is generalized to the selection of ½-dominant equilibria in arbitrary games. Other applications involve two-dimensional local interaction and cycles as long run stochastically stable sets.
Journal Article
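Stated informally, and with regularity conditions omitted, the radius-coradius result described in the abstract above takes the following form for a union of limit sets \(\Omega\) of the unperturbed dynamics:
\[
R(\Omega) > CR^{*}(\Omega) \;\Longrightarrow\; \text{the long-run stochastically stable set is contained in } \Omega,
\]
with the expected waiting time to reach \(\Omega\) from any state of order \(\varepsilon^{-CR^{*}(\Omega)}\) as the noise level \(\varepsilon \to 0\), which is what lets the same calculation bound the speed of evolutionary change.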