Search Results
20,710 results for "Calculus of variations."
Theory and Applications of Robust Optimization
In this paper we survey the primary research, both theoretical and applied, in the area of robust optimization (RO). Our focus is on the computational attractiveness of RO approaches, as well as the modeling power and broad applicability of the methodology. In addition to surveying prominent theoretical results of RO, we also present some recent results linking RO to adaptable models for multistage decision-making problems. Finally, we highlight applications of RO across a wide spectrum of domains, including finance, statistics, learning, and various areas of engineering.
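The computational attractiveness the survey emphasizes can be seen in the simplest case: for a linear program with box uncertainty in a constraint row and nonnegative variables, the robust counterpart is again a plain LP with worst-case coefficients. A minimal sketch with made-up data (all numbers below are illustrative, not from the paper):

```python
import numpy as np
from scipy.optimize import linprog

# Toy LP (hypothetical data): minimize cost c @ x subject to a @ x >= b and
# x >= 0, where the row a is only known to lie in the box
# [a_bar - delta, a_bar + delta].
c = np.array([2.0, 3.0])
a_bar = np.array([1.0, 2.0])
delta = np.array([0.2, 0.5])
b = 10.0

# Robust counterpart: for x >= 0 the worst case of a @ x over the box is
# attained at a = a_bar - delta, so the robust problem is again a single
# deterministic LP with tightened coefficients.
nominal = linprog(c, A_ub=-a_bar.reshape(1, -1), b_ub=[-b],
                  bounds=[(0, None)] * 2)
robust = linprog(c, A_ub=-(a_bar - delta).reshape(1, -1), b_ub=[-b],
                 bounds=[(0, None)] * 2)
print(nominal.fun, robust.fun)  # the robust optimum is never cheaper
```

The price of robustness here is the gap between the two optimal values: immunizing against every coefficient vector in the box costs extra, but the problem class (an LP) is unchanged.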
SparseNet: Coordinate Descent With Nonconvex Penalties
We address the problem of sparse selection in linear models. A number of nonconvex penalties have been proposed in the literature for this purpose, along with a variety of convex-relaxation algorithms for finding good solutions. In this article we pursue a coordinate-descent approach for optimization, and study its convergence properties. We characterize the properties of penalties suitable for this approach, study their corresponding threshold functions, and describe a df-standardizing reparametrization that assists our pathwise algorithm. The MC+ penalty is ideally suited to this task, and we use it to demonstrate the performance of our algorithm. Certain technical derivations and experiments related to this article are included in the Supplementary Materials section.
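The threshold function for the MC+ penalty has a closed form (the "firm" threshold, interpolating between soft and hard thresholding), which is what makes each coordinate update cheap. A minimal sketch, assuming columns of X standardized to unit L2 norm; the function names are ours, not the authors':

```python
import numpy as np

def mcplus_threshold(z, lam, gamma):
    """Firm (MC+) threshold for gamma > 1: gamma -> inf recovers soft
    thresholding (lasso); gamma -> 1+ approaches hard thresholding."""
    az = np.abs(z)
    if az <= lam:
        return 0.0
    if az <= gamma * lam:
        return np.sign(z) * (az - lam) / (1.0 - 1.0 / gamma)
    return z

def coordinate_descent(X, y, lam, gamma=3.0, n_iter=100):
    """Cyclic coordinate descent for
        (1/2) ||y - X b||^2 + sum_j P(b_j; lam, gamma)
    with the MC+ penalty P, assuming unit-L2-norm columns of X."""
    n, p = X.shape
    beta = np.zeros(p)
    r = y.copy()                            # residual y - X @ beta
    for _ in range(n_iter):
        for j in range(p):
            zj = X[:, j] @ r + beta[j]      # univariate fit for coordinate j
            bj = mcplus_threshold(zj, lam, gamma)
            r += X[:, j] * (beta[j] - bj)   # update residual in place
            beta[j] = bj
    return beta
```

With an orthogonal design each coordinate decouples, so the solution is just the firm threshold applied marginally; the pathwise strategy in the article additionally warm-starts over a grid of (lam, gamma) values, which this sketch omits.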
BOUNDS ON ELASTICITIES WITH OPTIMIZATION FRICTIONS: A SYNTHESIS OF MICRO AND MACRO EVIDENCE ON LABOR SUPPLY
How can price elasticities be identified when agents face optimization frictions such as adjustment costs or inattention? I derive bounds on structural price elasticities that are a function of the observed effect of a price change on demand, the size of the price change, and the degree of frictions. The degree of frictions is measured by the utility losses agents tolerate to deviate from the frictionless optimum. The bounds imply that frictions affect intensive margin elasticities much more than extensive margin elasticities. I apply these bounds to the literature on labor supply. The utility costs of ignoring the tax changes used to identify intensive margin labor supply elasticities are typically less than 1% of earnings. As a result, small frictions can explain the differences between micro and macro elasticities, extensive and intensive margin elasticities, and other disparate findings. Pooling estimates from existing studies, I estimate a Hicksian labor supply elasticity of 0.33 on the intensive margin and 0.25 on the extensive margin after accounting for frictions.
Bat algorithm for constrained optimization tasks
In this study, we use a new metaheuristic optimization algorithm, called the bat algorithm (BA), to solve constrained optimization tasks. BA is verified using several classical benchmark constrained problems. For further validation, BA is applied to three benchmark constrained engineering problems reported in the specialized literature. The performance of the bat algorithm is compared with various existing algorithms. The optimal solutions obtained by BA are found to be better than the best solutions provided by the existing methods. Finally, the unique search features used in BA are analyzed, and their implications for future research are discussed in detail.
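The search features the abstract refers to are frequency tuning, loudness, and pulse emission rate. A generic sketch of these ingredients, with constraints g(x) <= 0 folded in through a static quadratic penalty; this is an illustration under our own parameter choices, not a reproduction of the paper's exact setup:

```python
import numpy as np

def bat_algorithm(obj, constraints, bounds, n_bats=20, n_iter=300,
                  f_range=(0.0, 2.0), alpha=0.9, gamma=0.9, seed=0):
    """Minimal bat algorithm sketch for minimizing obj(x) subject to
    g(x) <= 0 for each g in `constraints` (static penalty method)."""
    rng = np.random.default_rng(seed)
    lo, hi = np.array(bounds, dtype=float).T
    dim = lo.size

    def fitness(x):
        viol = sum(max(0.0, g(x)) ** 2 for g in constraints)
        return obj(x) + 1e6 * viol          # quadratic penalty for violations

    x = rng.uniform(lo, hi, (n_bats, dim))
    v = np.zeros((n_bats, dim))
    loud = np.ones(n_bats)                  # loudness A_i
    pulse = rng.uniform(0, 1, n_bats)       # pulse emission rate r_i
    fit = np.array([fitness(xi) for xi in x])
    best = x[fit.argmin()].copy()

    for t in range(1, n_iter + 1):
        freq = rng.uniform(f_range[0], f_range[1], n_bats)
        v += (best - x) * freq[:, None]     # frequency-tuned pull toward best
        cand = np.clip(x + v, lo, hi)
        # bats with low pulse rate instead take a local walk around the
        # current best, with step size tied to the average loudness
        walk = rng.random(n_bats) > pulse
        cand[walk] = np.clip(best + 0.1 * loud.mean()
                             * rng.standard_normal((walk.sum(), dim)), lo, hi)
        cand_fit = np.array([fitness(ci) for ci in cand])
        # accept improving moves with probability given by loudness, then
        # quieten the bat and raise its pulse rate (exploit more, shout less)
        accept = (cand_fit < fit) & (rng.random(n_bats) < loud)
        x[accept], fit[accept] = cand[accept], cand_fit[accept]
        loud[accept] *= alpha
        pulse[accept] = 1.0 - np.exp(-gamma * t)
        best = x[fit.argmin()].copy()
    return best, float(fit.min())
```

The decaying loudness and rising pulse rate give the automatic zooming behavior often credited for BA's performance: early iterations explore broadly, while late iterations concentrate small steps around the incumbent best.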
The extremal solution for the fractional Laplacian
We study the extremal solution for the problem $(-\Delta)^s u = \lambda f(u)$ in $\Omega$, $u \equiv 0$ in $\mathbb{R}^n \setminus \Omega$, where $\lambda > 0$ is a parameter and $s \in (0,1)$. We extend some well-known results for the extremal solution when the operator is the Laplacian to this nonlocal case. For general convex nonlinearities we prove that the extremal solution is bounded in dimensions $n < 4s$. We also show that, for exponential and power-like nonlinearities, the extremal solution is bounded whenever $n < 10s$. In the limit $s \uparrow 1$, $n < 10$ is optimal. In addition, we show that the extremal solution is in $H^s(\mathbb{R}^n)$ in any dimension whenever the domain is convex. To obtain some of these results we need $L^q$ estimates for solutions to the linear Dirichlet problem for the fractional Laplacian with $L^p$ data. We prove optimal $L^q$ and $C^\beta$ estimates, depending on the value of $p$. These estimates follow from classical embedding results for the Riesz potential in $\mathbb{R}^n$. Finally, to prove the $H^s$ regularity of the extremal solution we need an $L^\infty$ estimate near the boundary of convex domains, which we obtain via the moving planes method. For it, we use a maximum principle in small domains for integro-differential operators with decreasing kernels.