8,605 result(s) for "Cost curve"
Predicting Landslide Susceptibility Using Cost Function in Low-Relief Areas: A Case Study of the Urban Municipality of Attecoube (Abidjan, Ivory Coast)
Landslides are among the most hazardous natural phenomena affecting Greater Abidjan, causing significant economic and social damage. Strategic planning supported by geographic information systems (GIS) can help mitigate potential losses and enhance disaster resilience. This study evaluates landslide susceptibility using logistic regression and frequency ratio models. The analysis is based on a dataset comprising 54 mapped landslide scarps collected from June 2015 to July 2023, along with 16 thematic predictor variables, including altitude, slope, aspect, profile curvature, plan curvature, drainage area, distance to the drainage network, normalized difference vegetation index (NDVI), and an urban-related layer. A high-resolution (5-m) digital elevation model (DEM), derived from multiple data sources, supports the spatial analysis. The landslide inventory was randomly divided into two subsets: 80% for model calibration and 20% for validation. After optimization and statistical testing, the selected thematic layers were integrated to produce a susceptibility map. The results indicate that 6.3% (0.7 km²) of the study area is classified as very highly susceptible. This class contained 61.2% of the landslide sample and had an estimated frequency ratio of 20.2. Among the predictive indicators, altitude, slope, the SE, S, and NW aspect classes, and NDVI were found to have a positive impact on landslide occurrence. Model performance was assessed using the area under the receiver operating characteristic curve (AUC), demonstrating strong predictive capability. These findings can support informed land-use planning and risk reduction strategies in urban areas. Furthermore, the prediction model should be communicated to and understood by local authorities to facilitate disaster management. The cost function was adopted as a novel approach to delineate hazardous zones.
Considering the landslide inventory period, the increasing hazard due to climate change, and the intensification of human activities, a reasoned choice of sample size was made. This informed decision enabled the production of an updated prediction map. Optimal thresholds were then derived to classify areas into high- and low-susceptibility categories. The prediction map will be useful to planners in helping them make decisions and implement protective measures.
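The frequency-ratio model mentioned in this abstract is straightforward to sketch. Below is a minimal, illustrative Python example; the slope classes and cell counts are invented for illustration and are not data from the study. FR compares the share of landslide cells falling in a class with the share of total area occupied by that class.

```python
# Hedged sketch of frequency-ratio (FR) susceptibility scoring.
# All class labels and counts below are hypothetical, not the study's data.

def frequency_ratio(landslide_cells, total_cells):
    """FR = (% of landslide cells in class) / (% of all cells in class)."""
    n_slides = sum(landslide_cells.values())
    n_total = sum(total_cells.values())
    return {
        cls: (landslide_cells[cls] / n_slides) / (total_cells[cls] / n_total)
        for cls in total_cells
    }

# Illustrative slope classes (degrees): DEM cell counts per class.
total_cells = {"0-5": 5000, "5-15": 3000, "15-30": 1500, ">30": 500}
landslide_cells = {"0-5": 4, "5-15": 10, "15-30": 22, ">30": 18}

fr = frequency_ratio(landslide_cells, total_cells)
# FR > 1 means the class is over-represented among landslides.
for cls, value in fr.items():
    print(f"slope {cls}: FR = {value:.2f}")
```

Summing the per-cell FR values of all predictor layers (slope, aspect, NDVI, and so on) yields a susceptibility index that can then be thresholded into classes and validated against the held-out 20% of the inventory.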
Applications of Marginal Abatement Cost Curve (MACC) for Reducing Greenhouse Gas Emissions: A Review of Methodologies
A wide range of Marginal Abatement Cost Curve (MACC) methods for reducing greenhouse gas (GHG) emissions has been introduced in the academic literature over the last decade; these methods address different issues, rely on different calculation logics, and produce different results and implications. A detailed review has not previously been carried out on the application of MACCs in terms of types of emissions, country/sector, and methodology used. This study aims to identify, interpret, and clarify the available literature on MACC development from 2010 to 2020 by reviewing previous applications along three analytic dimensions: GHG emission type, research objects, and modeling methodologies (top-down and bottom-up), providing researchers with information on past developments and future trends in this area. The results show that CO2 is one of the most studied GHG emissions in calculating marginal abatement costs and that some countries/regions have received little attention from researchers in assessing emission reductions. Finally, the bottom-up MACC methodology centers on the engineering model method, while the distance function method is favored among top-down approaches. Furthermore, this study also highlights possible research opportunities, which may lead to more successful and impactful results in future MACC studies.
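As a rough illustration of the bottom-up logic this review surveys, a MACC can be sketched by sorting abatement measures by marginal cost and accumulating their potentials. The measures and figures below are invented placeholders, not results from the reviewed literature.

```python
# Hedged sketch of bottom-up MACC construction. Each measure is
# (name, marginal cost in $/tCO2, abatement potential in MtCO2/yr);
# all values are illustrative.

measures = [
    ("LED lighting", -40.0, 5.0),        # negative cost = net saving
    ("Building insulation", -15.0, 12.0),
    ("Wind power", 10.0, 30.0),
    ("Carbon capture", 60.0, 20.0),
]

def build_macc(measures):
    """Sort by marginal cost, accumulate potential: the curve's x-axis."""
    curve = []
    cumulative = 0.0
    for name, cost, potential in sorted(measures, key=lambda m: m[1]):
        cumulative += potential
        curve.append((name, cost, cumulative))
    return curve

for name, cost, cum in build_macc(measures):
    print(f"{name:20s} {cost:+7.1f} $/tCO2  cumulative {cum:5.1f} MtCO2/yr")
```

Plotting cost against cumulative potential as a step function gives the familiar staircase-shaped MACC; measures below a given carbon price are the ones that model deems cost-effective at that price.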
Assessment of energy saving potential and CO2 abatement cost curve in 2030 for steel industry in Thailand
The master plan of energy management for the Thai iron and steel industry has been proposed by the Iron and Steel Institute of Thailand (ISIT). Three plausible scenarios in the master plan were S1: without an integrated steel plant (baseline scenario), S2: with a traditional integrated BF–BOF route, and S3: with an alternative integrated DR-EAF route. This study investigated the potential for energy reduction and CO2 emission reduction in 2030 under two reduction target scenarios: scenario A, achieving ISIT's plan, and scenario B, maximum energy reduction. Moreover, the CO2 abatement cost curve and the sensitivity of the abatement cost to different interest rates were studied. Under the baseline scenario (S1), the potential energy reduction and CO2 reduction were 12.74 million GJ and 1.28 million tCO2-eq. The traditional integrated BF–BOF route (S2) exhibited the highest energy saving and CO2 reduction potential, followed by S3 (DR-EAF) and S1 (baseline). The maximum energy reduction and CO2 reduction could be increased by 11.8% and 17.9%, respectively, relative to ISIT's plan. The sensitivity analysis indicated that changing the interest rate (3.27%, 4.27%, and 5.27%) affected the abatement cost by −21% to +24% compared with the long-term interest rate of 4.27%.
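The interest-rate sensitivity described above works through the annualization of capital costs. A hedged sketch, with invented numbers (not the study's data), using the standard capital recovery factor:

```python
# Hedged sketch: sensitivity of annualized abatement cost to the interest
# rate via the capital recovery factor (CRF). All inputs are illustrative.

def capital_recovery_factor(rate, lifetime_years):
    """CRF = r(1+r)^n / ((1+r)^n - 1): annualizes an upfront investment."""
    g = (1 + rate) ** lifetime_years
    return rate * g / (g - 1)

def abatement_cost(investment, annual_om, annual_energy_saving,
                   annual_co2_reduction, rate, lifetime_years):
    """$/tCO2: (annualized capital + O&M - energy savings) / tCO2 abated."""
    annual_capital = investment * capital_recovery_factor(rate, lifetime_years)
    return (annual_capital + annual_om - annual_energy_saving) / annual_co2_reduction

# Hypothetical measure: $10M investment, 15-year life, 50 ktCO2/yr abated.
for rate in (0.0327, 0.0427, 0.0527):
    cost = abatement_cost(10e6, 2e5, 6e5, 5e4, rate, 15)
    print(f"interest {rate:.2%}: abatement cost {cost:6.2f} $/tCO2")
```

Because the CRF grows with the interest rate, capital-intensive measures see their abatement cost rise at higher rates, which is the mechanism behind the −21% to +24% range the abstract reports.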
Estimating Welfare in Insurance Markets Using Variation in Prices
We provide a graphical illustration of how standard consumer and producer theory can be used to quantify the welfare loss associated with inefficient pricing in insurance markets with selection. We then show how this welfare loss can be estimated empirically using identifying variation in the price of insurance. Such variation, together with quantity data, allows us to estimate the demand for insurance. The same variation, together with cost data, allows us to estimate how insurers' costs vary as market participants endogenously respond to price. The slope of this estimated cost curve provides a direct test for both the existence and the nature of selection, and the combination of demand and cost curves can be used to estimate welfare. We illustrate our approach by applying it to data on employer-provided health insurance from one specific company. We detect adverse selection but estimate that the quantitative welfare implications associated with inefficient pricing in our particular application are small, in both absolute and relative terms.
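A minimal numerical sketch of this logic, with invented linear demand and cost curves (not the paper's estimates): a downward-sloping marginal cost curve signals adverse selection, the competitive equilibrium sets price equal to average cost, and the welfare loss is the area between demand and marginal cost between the equilibrium and efficient quantities.

```python
# Hedged sketch of the demand-and-cost-curve welfare calculation.
# All coefficients are made up for illustration; q is the fraction insured.

def demand_price(q):   # willingness to pay of the marginal buyer
    return 100.0 - 80.0 * q

def marginal_cost(q):  # expected cost of the marginal buyer
    return 60.0 - 40.0 * q   # downward slope => adverse selection

def average_cost(q):   # average of marginal cost over the first q buyers
    return 60.0 - 20.0 * q

# Closed-form intersections of the linear curves:
q_star = (100.0 - 60.0) / (80.0 - 40.0)  # demand = MC -> efficient quantity
q_eq   = (100.0 - 60.0) / (80.0 - 20.0)  # demand = AC -> competitive quantity

# Welfare loss: area between demand and MC from q_eq to q_star
# (the trapezoid rule is exact for linear curves).
gap_eq   = demand_price(q_eq) - marginal_cost(q_eq)
gap_star = demand_price(q_star) - marginal_cost(q_star)
loss = 0.5 * (gap_eq + gap_star) * (q_star - q_eq)
print(f"efficient q* = {q_star:.3f}, equilibrium q = {q_eq:.3f}, "
      f"welfare loss = {loss:.3f}")
```

With adverse selection the break-even price exceeds the marginal buyer's cost, so too few people insure (q_eq < q_star); the same machinery with an upward-sloping marginal cost curve would instead show advantageous selection and over-insurance.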
Monopsony in Motion
What happens if an employer cuts wages by one cent? Much of labor economics is built on the assumption that all the workers will quit immediately. Here, Alan Manning mounts a systematic challenge to the standard model of perfect competition. Monopsony in Motion stands apart by analyzing labor markets from the real-world perspective that employers have significant market (or monopsony) power over their workers. Arguing that this power derives from frictions in the labor market that make it time-consuming and costly for workers to change jobs, Manning re-examines much of labor economics based on this alternative and equally plausible assumption. The book addresses the theoretical implications of monopsony and presents a wealth of empirical evidence. Our understanding of the distribution of wages, unemployment, and human capital can all be improved by recognizing that employers have some monopsony power over their workers. Also considered are policy issues including the minimum wage, equal pay legislation, and caps on working hours. In a monopsonistic labor market, concludes Manning, the "free" market can no longer be sustained as an ideal and labor economists need to be more open-minded in their evaluation of labor market policies. Monopsony in Motion will represent for some a new fundamental text in the advanced study of labor economics, and for others, an invaluable alternative perspective that henceforth must be taken into account in any serious consideration of the subject.
The free market innovation machine
Why has capitalism produced economic growth that so vastly dwarfs the growth record of other economic systems, past and present? Why have living standards in countries from America to Germany to Japan risen exponentially over the past century? William Baumol rejects the conventional view that capitalism benefits society through price competition--that is, products and services become less costly as firms vie for consumers. Where most others have seen this as the driving force behind growth, he sees something different--a compound of systematic innovation activity within the firm, an arms race in which no firm in an innovating industry dares to fall behind the others in new products and processes, and inter-firm collaboration in the creation and use of innovations. While giving price competition due credit, Baumol stresses that large firms use innovation as a prime competitive weapon. However, as he explains it, firms do not wish to risk too much innovation, because it is costly, and can be made obsolete by rival innovation. So firms have split the difference through the sale of technology licenses and participation in technology-sharing compacts that pay huge dividends to the economy as a whole--and thereby made innovation a routine feature of economic life. This process, in Baumol's view, accounts for the unparalleled growth of modern capitalist economies. Drawing on extensive research and years of consulting work for many large global firms, Baumol shows in this original work that the capitalist growth process, at least in societies where the rule of law prevails, comes far closer to the requirements of economic efficiency than is typically understood. Resounding with rare intellectual force, this book marks a milestone in the comprehension of the accomplishments of our free-market economic system--a new understanding that, suggests the author, promises to benefit many countries that lack the advantages of this immense innovation machine.
Technical potentials and costs for reducing global anthropogenic methane emissions in the 2050 timeframe - results from the GAINS model
Methane is the second most important greenhouse gas after carbon dioxide contributing to human-made global warming. Keeping to the Paris Agreement of staying well below two degrees warming will require a concerted effort to curb methane emissions in addition to necessary decarbonization of the energy systems. The fastest way to achieve emission reductions in the 2050 timeframe is likely through implementation of various technical options. The focus of this study is to explore the technical abatement and cost pathways for reducing global methane emissions, breaking reductions down to regional and sector levels using the most recent version of IIASA's Greenhouse gas and Air pollution Interactions and Synergies (GAINS) model. The diverse human activities that contribute to methane emissions make detailed information on potential global impacts of actions at the regional and sectoral levels particularly valuable for policy-makers. With a global annual inventory for 1990-2015 as starting point for projections, we produce a baseline emission scenario to 2050 against which future technical abatement potentials and costs are assessed at a country and sector/technology level. We find it technically feasible in year 2050 to remove 54 percent of global methane emissions below baseline, however, due to locked in capital in the short run, the cumulative removal potential over the period 2020-2050 is estimated at 38 percent below baseline. This leaves 7.7 Pg methane released globally between today and 2050 that will likely be difficult to remove through technical solutions. There are extensive technical opportunities at low costs to control emissions from waste and wastewater handling and from fossil fuel production and use. A considerably more limited technical abatement potential is found for agricultural emissions, in particular from extensive livestock rearing in developing countries. 
This calls for widespread implementation in the 2050 timeframe of institutional and behavioural options in addition to technical solutions.
The Cost of Debt
We use exogenous variation in tax benefit functions to estimate firm-specific cost of debt functions that are conditional on company characteristics such as collateral, size, and book-to-market. By integrating the area between the benefit and cost functions, we estimate that the equilibrium net benefit of debt is 3.5% of asset value, resulting from an estimated gross benefit (cost) of debt equal to 10.4% (6.9%) of asset value. We find that the cost of being overlevered is asymmetrically higher than the cost of being underlevered and that expected default costs constitute only half of the total ex ante costs of debt.
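The paper's central accounting idea, integrating the area between the benefit and cost functions of debt up to their crossing point, can be sketched numerically. The functional forms and coefficients below are invented for illustration and are not the paper's estimated functions.

```python
# Hedged sketch: net benefit of debt as the area between a downward-sloping
# marginal benefit curve and an upward-sloping marginal cost curve, up to
# the equilibrium leverage d* where they cross. All numbers are hypothetical.

def marginal_benefit(d):  # tax benefit per unit of debt (fraction of assets)
    return 0.35 - 0.5 * d     # d = debt / asset value

def marginal_cost(d):     # expected distress cost per unit of debt
    return 0.05 + 0.7 * d

# Crossing point of the two linear curves -> equilibrium leverage.
d_star = (0.35 - 0.05) / (0.5 + 0.7)

# Trapezoidal integration of (benefit - cost) from 0 to d_star.
n = 1000
h = d_star / n
net = 0.0
for i in range(n):
    d0, d1 = i * h, (i + 1) * h
    net += 0.5 * ((marginal_benefit(d0) - marginal_cost(d0))
                  + (marginal_benefit(d1) - marginal_cost(d1))) * h
print(f"equilibrium leverage d* = {d_star:.3f}, "
      f"net benefit = {net:.4f} of asset value")
```

A firm left of d* is underlevered and one right of d* is overlevered; the paper's asymmetry result corresponds to the cost curve rising more steeply past d* than the benefit curve falls short of it below d*.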
Greenhouse gas implications of mobilizing agricultural biomass for energy: a reassessment of global potentials in 2050 under different food-system pathways
Global bioenergy potentials have been the subject of extensive research and continued controversy. Due to vast uncertainties regarding future yields, diets and other influencing parameters, estimates of future agricultural biomass potentials vary widely. Most scenarios compatible with ambitious climate targets foresee a large expansion of bioenergy, mainly from energy crops, that needs to be kept consistent with projections of agriculture and food production. Using the global biomass balance model BioBaM, we here present an assessment of agricultural bioenergy potentials compatible with the Food and Agriculture Organization's (2018) 'Alternative pathways to 2050' projections. Mobilizing biomass at larger scales may be associated with systemic feedbacks causing greenhouse gas (GHG) emissions, e.g. crop residue removal resulting in loss of soil carbon stocks and increased emissions from fertilization. To assess these effects, we derive 'GHG cost supply-curves', i.e. integrated representations of biomass potentials and their systemic GHG costs. Livestock manure is most favourable in terms of GHG costs, as anaerobic digestion yields reductions of GHG emissions from manure management. Global potentials from intensive livestock systems are about 5 EJ/yr. Crop residues can provide up to 20 EJ/yr at moderate GHG costs. For energy crops, we find that the medium range of literature estimates (∼40 to 90 EJ/yr) is only compatible with FAO yield and human diet projections if energy plantations expand into grazing areas (∼4-5 million km²) and grazing land is intensified globally. Direct carbon stock changes associated with perennial energy crops are beneficial for climate mitigation, yet there are 'opportunity GHG costs', sometimes considerable, if one accounts for the foregone opportunity of afforestation. Our results indicate that the large potentials of energy crops foreseen in many energy scenarios are not freely and unconditionally available.
Disregarding systemic effects in agriculture can result in misjudgement of GHG saving potentials and flawed climate mitigation strategies.
Supermodularity and complementarity (Frontiers of economic research)
The economics literature is replete with examples of monotone comparative statics; that is, scenarios where optimal decisions or equilibria in a parameterized collection of models vary monotonically with the parameter. Most of these examples are manifestations of complementarity, with a common explicit or implicit theoretical basis in properties of a supermodular function on a lattice. Supermodular functions yield a characterization for complementarity and extend the notion of complementarity to a general setting that is a natural mathematical context for studying complementarity and monotone comparative statics. Concepts and results related to supermodularity and monotone comparative statics constitute a new and important formal step in the long line of economics literature on complementarity. This monograph links complementarity to powerful concepts and results involving supermodular functions on lattices and focuses on analyses and issues related to monotone comparative statics. Don Topkis, who is known for his seminal contributions to this area, here presents a self-contained and up-to-date view of this field, including many new results, to scholars interested in economic theory and its applications as well as to those in related disciplines. The emphasis is on methodology. The book systematically develops a comprehensive, integrated theory pertaining to supermodularity, complementarity, and monotone comparative statics. It then applies that theory in the analysis of many diverse economic models formulated as decision problems, noncooperative games, and cooperative games.