Catalogue Search | MBRL
Search Results
Explore the vast range of titles available.
420 result(s) for "Directional derivative"
The Directional Derivative in General Quantum Calculus
by
Karim, Avin O.
,
Shehata, Enas M.
,
Cardoso, José Luis
in
Calculus
,
Continuity (mathematics)
,
Derivatives
2022
In this paper, we define the β-partial derivative as well as the β-directional derivative of a multi-variable function based on the β-difference operator, Dβ, which is defined by Dβf(t) = (f(β(t)) − f(t))/(β(t) − t), where β is a strictly increasing continuous function. Some properties are proved. Furthermore, the β-gradient vector and the β-gradient directional derivative of a multi-variable function are introduced. Finally, we deduce the Hahn-partial and the Hahn-directional derivatives associated with the Hahn difference operator.
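As a rough illustration of the operator defined above (not taken from the paper), the following Python sketch evaluates Dβf(t) for the Hahn-type choice β(t) = qt + ω mentioned in the abstract; the parameter values and the fallback near fixed points of β are assumptions made for the sake of the example.

```python
# A minimal numerical sketch (not from the paper) of the beta-difference operator
#   D_beta f(t) = (f(beta(t)) - f(t)) / (beta(t) - t),
# with the Hahn-type choice beta(t) = q*t + w; parameter values are illustrative.

def beta_derivative(f, beta, t, eps=1e-12):
    """Evaluate D_beta f(t); near a fixed point of beta the quotient degenerates,
    so we fall back to an ordinary forward difference (an assumption, for the demo)."""
    bt = beta(t)
    if abs(bt - t) < eps:
        h = 1e-6
        return (f(t + h) - f(t)) / h
    return (f(bt) - f(t)) / (bt - t)

q, w = 0.5, 0.1                      # illustrative Hahn parameters
beta = lambda t: q * t + w           # beta(t) = q*t + omega
print(beta_derivative(lambda t: t**2, beta, 2.0))   # prints 3.1 = (q + 1)*t + omega at t = 2
```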
Journal Article
Optimality and Complexity for Constrained Optimization Problems with Nonconvex Regularization
by
Chen, Xiaojun
,
Bian, Wei
in
Chaos theory
,
constrained nonsmooth nonconvex optimization
,
Continuity (mathematics)
2017
In this paper, we consider a class of constrained optimization problems where the feasible set is a general closed convex set, and the objective function has a nonsmooth, nonconvex regularizer. Such a regularizer includes widely used SCAD, MCP, logistic, fraction, hard thresholding, and non-Lipschitz Lₚ penalties as special cases. Using the theory of the generalized directional derivative and the tangent cone, we derive a first order necessary optimality condition for local minimizers of the problem, and define the generalized stationary point of it. We show that the generalized stationary point is the Clarke stationary point when the objective function is Lipschitz continuous at this point, and satisfies the existing necessary optimality conditions when the objective function is not Lipschitz continuous at this point. Moreover, we prove the consistency between the generalized directional derivative and the limit of the classic directional derivatives associated with the smoothing function. Finally, we establish a lower bound property for every local minimizer and show that finding a global minimizer is strongly NP-hard when the objective function has a concave regularizer.
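For a concrete sense of the objects named in the abstract, the sketch below (not the authors' code) implements the non-Lipschitz Lₚ penalty, one common smoothing of it (an assumed choice, not necessarily the paper's smoothing function), and a classic one-sided directional derivative by forward differences, so the smoothed directional derivatives can be observed as the smoothing parameter shrinks.

```python
# A hedged illustration (not the paper's algorithm): a non-Lipschitz L_p penalty,
# a simple smoothing of it, and classic directional derivatives of the smoothed
# function as the smoothing parameter mu shrinks.
import numpy as np

def lp_penalty(x, p=0.5):
    """Non-Lipschitz L_p regularizer: sum_i |x_i|^p with 0 < p < 1."""
    return np.sum(np.abs(x) ** p)

def smoothed_lp(x, mu, p=0.5):
    """One common smoothing: |x_i| replaced by sqrt(x_i^2 + mu^2) (an assumed
    choice, not necessarily the smoothing function used in the paper)."""
    return np.sum(np.sqrt(x**2 + mu**2) ** p)

def directional_derivative(f, x, d, t=1e-7):
    """Classic one-sided directional derivative f'(x; d) via forward differences."""
    return (f(x + t * d) - f(x)) / t

x = np.array([0.5, -1.0, 0.25])
d = np.array([1.0, 0.0, 1.0])
for mu in (1e-1, 1e-2, 1e-3):
    print(mu, directional_derivative(lambda z: smoothed_lp(z, mu), x, d))
```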
Journal Article
Higher-order conditions for strict local Pareto minima for problems with partial order introduced by a polyhedral cone
2018
Using the definitions of μ-th order lower and upper directional derivatives of vector-valued functions, introduced in Rahmo and Studniarski (J. Math. Anal. Appl. 393 (2012), 212–221), we provide some necessary and sufficient conditions for strict local Pareto minimizers of order μ for optimization problems where the partial order is introduced by a pointed polyhedral cone with non-empty interior.
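For orientation, the scalar prototype of the derivatives used here is the classical lower directional derivative of order μ; the vector-valued, cone-relative versions in the cited paper are defined analogously. The formula below is a standard Studniarski-type form and is not quoted from the paper:

```latex
% Scalar lower directional derivative of order \mu at x in direction d
% (classical Studniarski-type form, shown for orientation only):
\underline{d}^{\,\mu} f(x;d) \;=\; \liminf_{\substack{t \downarrow 0 \\ u \to d}}
  \frac{f(x+tu)-f(x)}{t^{\mu}}
```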
Journal Article
Corner Detection Using Multi-directional Structure Tensor with Multiple Scales
2020
Corners are important features for image analysis and computer vision tasks. Local structure tensors with multiple scales are widely used in intensity-based corner detectors. In this paper, the properties of intensity variations of a step edge, L-type corner, Y- or T-type corner, X-type corner, and star-type corner are investigated. The properties that we obtained indicate that the image intensity variations of a corner are not always large in all directions. The properties also demonstrate that existing structure tensor-based corner detection methods cannot depict the differences in intensity variations between edges and corners well, which results in wrong corner detections. We present a new technique to extract the intensity variations from input images using anisotropic Gaussian directional derivative filters with multiple scales. We prove that the new extraction technique on image intensity variation has the ability to accurately depict the characteristics of edges and corners in the continuous domain. Furthermore, the properties of the intensity variations of step edges and corners enable us to derive a new multi-directional structure tensor with multiple scales, which has the ability to depict the intensity variation differences well between edges and corners in the discrete domain. The eigenvalues of the multi-directional structure tensor with multiple scales are used to develop a new corner detection method. Finally, the criteria on average repeatability (under affine image transformation, JPEG compression, and noise degradation), region repeatability based on the Oxford dataset, repeatability metric based on the DTU dataset, detection accuracy, and localization accuracy are used to evaluate the proposed detector against ten state-of-the-art methods. The experimental results show that our proposed detector outperforms all the other tested detectors.
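As a minimal sketch of the basic ingredient described here (not the authors' detector), the snippet below computes a Gaussian directional derivative response of an image for several orientations; it uses isotropic Gaussian derivative filters from SciPy rather than the anisotropic filters of the paper, and the image, scale, and number of orientations are placeholders.

```python
# A minimal sketch (not the authors' detector) of extracting directional intensity
# variation with Gaussian derivative filters; isotropic Gaussians from SciPy are
# used here instead of the paper's anisotropic filters, as a simplification.
import numpy as np
from scipy import ndimage

def directional_derivative(image, theta, sigma=2.0):
    """First-order Gaussian derivative of `image` along direction theta (radians)."""
    gy = ndimage.gaussian_filter(image, sigma, order=(1, 0))   # derivative along rows (y)
    gx = ndimage.gaussian_filter(image, sigma, order=(0, 1))   # derivative along columns (x)
    return np.cos(theta) * gx + np.sin(theta) * gy

img = np.random.rand(64, 64)                                   # placeholder image
responses = [directional_derivative(img, k * np.pi / 8) for k in range(8)]
```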
Journal Article
OPTIMALITY CONDITIONS FOR EFFICIENCY ON NONSMOOTH MULTIOBJECTIVE PROGRAMMING PROBLEMS
by
Long, Xian-Jun
,
Huang, Nan-Jing
in
Directional derivatives
,
Mathematical functions
,
Necessary conditions for optimality
2014
In this paper, a nonsmooth multiobjective programming problem is introduced and studied. By using the generalized Guignard constraint qualification, some stronger Kuhn-Tucker type necessary optimality conditions for efficiency in terms of convexificators are established, in which we are not assuming that the objective functions are directionally differentiable. Moreover, some conditions which ensure that a feasible solution is an efficient solution to nonsmooth multiobjective programming problems are also given. The results presented in this paper improve the corresponding results in the literature.
2010 Mathematics Subject Classification: 90C29, 90C46, 49J52.
Key words and phrases: Optimality condition, Nonsmooth multiobjective programming, Efficient solution, Dini directional derivatives, Convexificators, Constraint qualification.
Journal Article
Gateaux semiderivative approach applied to shape optimization of obstacle problems
2024
Shape optimization problems constrained by variational inequalities (VI) are non-smooth and non-convex optimization problems. The non-smoothness arises from the variational inequality constraint, which makes it challenging to derive optimality conditions. Besides the non-smoothness, there are complementarity aspects due to the VIs, as well as distributed, non-linear, non-convex and infinite-dimensional aspects due to the shapes, which complicate setting up an optimality system and, thus, developing efficient solution algorithms. In this paper, we consider Gateaux semiderivatives for the purpose of formulating optimality conditions. In the application, we concentrate on a shape optimization problem constrained by the obstacle problem.
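To make the central object concrete, here is a hedged numerical sketch of a one-sided Gateaux semiderivative dJ(u; h) = lim_{t→0+} (J(u + th) − J(u))/t approximated by forward differences; the example functional is illustrative and unrelated to the obstacle problem.

```python
# A hedged sketch: one-sided Gateaux semiderivative dJ(u; h) approximated by a
# forward difference quotient; the functional J is illustrative, not the obstacle
# problem's reduced objective.
import numpy as np

def gateaux_semiderivative(J, u, h, t=1e-6):
    """Approximate dJ(u; h) = lim_{t -> 0+} (J(u + t*h) - J(u)) / t."""
    return (J(u + t * h) - J(u)) / t

J = lambda u: np.sum(np.abs(u))      # semidifferentiable but not differentiable at 0
u = np.zeros(3)
h = np.array([1.0, -1.0, 2.0])
print(gateaux_semiderivative(J, u, h))   # ~ 4.0 = |1| + |-1| + |2|
```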
Journal Article
Sensitivity Analysis of Traffic Equilibria
2004
The contribution of the paper is a complete analysis of the sensitivity of elastic demand traffic (Wardrop) equilibria. The existence of a directional derivative of the equilibrium solution (link flow, least travel cost, demand) in any direction is given a characterization, and the same is done for its gradient. The gradient, if it exists, is further interpreted as a limiting case of the gradient of the logit-based SUE solution, as the dispersion parameter tends to infinity. In the absence of the gradient, we show how to compute a subgradient. All these computations (directional derivative, (sub)gradient) are performed by solving similar traffic equilibrium problems with affine link cost and demand functions, and they can be performed by the same tool as (or one similar to) the one used for the original traffic equilibrium model; this fact is of clear advantage when applying sensitivity analysis within a bilevel (or mathematical program with equilibrium constraints, MPEC) application, such as for congestion pricing, OD estimation, or network design. A small example illustrates the possible nonexistence of a gradient and the computation of a subgradient.
Journal Article
Mathematical connections promoted in multivariable calculus classes and in problem solving about vectors, partial and directional derivatives, and applications
2025
In a vector calculus course, the mathematical connections made by an in-service teacher and his engineering students during problem solving involving vectors and partial and directional derivatives were explored. This study is relevant because of the difficulties students have in connecting the multiple representations and meanings of ordinary and partial derivatives. Networking between the extended theory of connections and the onto-semiotic approach was used. The qualitative methodology included three stages: (1) selection of participants (an in-service teacher and his students); (2) data collection in four moments: design of the class on partial and directional derivatives, its delivery with participant observation and recording, design of a questionnaire, and its application to the students; and (3) data analysis using theoretical tools. The results showed that the in-service teacher used various connections, starting with instructionally oriented ones and then others such as meaning, procedural, and representational connections. Students defined and represented the concepts of vector, partial, and directional derivatives, activating meaning connections and different representations. They also solved tasks using different connections (different representations, procedural, feature) to find partial and directional derivatives, gradient, curl, and divergence. This analysis was carried out in terms of mathematical practices, processes, objects, and semiotic functions. 72% of the students gave meaning to, represented, and appropriately used the concepts of vector calculus, while 28% had difficulties, especially with the procedural connection needed to find partial derivatives.
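As an illustration of the kind of task described (not material from the study), the SymPy snippet below computes partial derivatives, the gradient, and a directional derivative of a two-variable function; the function and direction are arbitrary choices.

```python
# An illustrative SymPy computation (not material from the study): partial
# derivatives, gradient, and a directional derivative of a two-variable function.
import sympy as sp

x, y = sp.symbols('x y')
f = x**2 * y + sp.sin(y)

grad = sp.Matrix([sp.diff(f, x), sp.diff(f, y)])   # gradient (f_x, f_y)
v = sp.Matrix([3, 4]) / 5                          # unit direction (3/5, 4/5)
D_v = (grad.T * v)[0]                              # directional derivative grad(f) . v

print(grad)                # Matrix([[2*x*y], [x**2 + cos(y)]])
print(sp.simplify(D_v))    # 6*x*y/5 + 4*x**2/5 + 4*cos(y)/5
```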
Journal Article
Discrete Approximations of Gaussian Smoothing and Gaussian Derivatives
2024
This paper develops an in-depth treatment concerning the problem of approximating the Gaussian smoothing and the Gaussian derivative computations in scale-space theory for application on discrete data. With close connections to previous axiomatic treatments of continuous and discrete scale-space theory, we consider three main ways of discretizing these scale-space operations in terms of explicit discrete convolutions, based on either (i) sampling the Gaussian kernels and the Gaussian derivative kernels, (ii) locally integrating the Gaussian kernels and the Gaussian derivative kernels over each pixel support region, to aim at suppressing some of the severe artefacts of sampled Gaussian kernels and sampled Gaussian derivatives at very fine scales, or (iii) basing the scale-space analysis on the discrete analogue of the Gaussian kernel, and then computing derivative approximations by applying small-support central difference operators to the spatially smoothed image data.
We study the properties of these three main discretization methods both theoretically and experimentally and characterize their performance by quantitative measures, including the results they give rise to with respect to the task of scale selection, investigated for four different use cases, and with emphasis on the behaviour at fine scales. The results show that the sampled Gaussian kernels and the sampled Gaussian derivatives, as well as the integrated Gaussian kernels and the integrated Gaussian derivatives, perform very poorly at very fine scales. At very fine scales, the discrete analogue of the Gaussian kernel with its corresponding discrete derivative approximations performs substantially better. The sampled Gaussian kernel and the sampled Gaussian derivatives do, on the other hand, lead to numerically very good approximations of the corresponding continuous results when the scale parameter is sufficiently large; in most of the experiments presented in the paper, this means a scale parameter greater than about 1, in units of the grid spacing. Below a standard deviation of about 0.75, the derivative estimates obtained from convolutions with the sampled Gaussian derivative kernels are, however, not numerically accurate or consistent, while the results obtained from the discrete analogue of the Gaussian kernel, with its associated central difference operators applied to the spatially smoothed image data, are then a much better choice.
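A small sketch, under stated assumptions, of two of the three discretization routes compared here: (i) sampling the continuous first-order Gaussian derivative kernel, and (iii) the discrete analogue of the Gaussian kernel T(n, t) = e^(−t) I_n(t) combined with a central difference mask; kernel radius and scale are placeholder values.

```python
# A small sketch, under stated assumptions, of two of the three discretizations:
# (i) a sampled first-order Gaussian derivative kernel, and (iii) the discrete
# analogue of the Gaussian kernel T(n, t) = exp(-t) * I_n(t) combined with the
# central difference mask [1/2, 0, -1/2]. Radius and scale are placeholders.
import numpy as np
from scipy.special import ive   # exponentially scaled modified Bessel function

def sampled_gaussian_derivative(sigma, radius):
    """Samples of the continuous first-order Gaussian derivative kernel."""
    n = np.arange(-radius, radius + 1)
    g = np.exp(-n**2 / (2.0 * sigma**2)) / (np.sqrt(2.0 * np.pi) * sigma)
    return -n / sigma**2 * g

def discrete_gaussian(sigma, radius):
    """Discrete analogue of the Gaussian kernel, T(n, t) with t = sigma^2."""
    t = sigma**2
    n = np.arange(-radius, radius + 1)
    return ive(np.abs(n), t)             # ive(n, t) = exp(-t) * I_n(t)

sigma, radius = 0.75, 5
kernel_i = sampled_gaussian_derivative(sigma, radius)                 # route (i)
kernel_iii = np.convolve(discrete_gaussian(sigma, radius),
                         [0.5, 0.0, -0.5], mode='same')               # route (iii)
```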
Journal Article
Mordukhovich Derivatives of Metric Projection Operator in Hilbert Spaces
2024
In this paper, we study the generalized differentiability of the metric projection operator in Hilbert spaces. We find exact expressions for Mordukhovich derivatives (also called Mordukhovich coderivatives) of the metric projection operator onto closed balls in Hilbert spaces and onto positive cones in Euclidean spaces and in the real Hilbert space ℓ². We investigate the connection between Fréchet derivatives, Gâteaux directional derivatives and the Mordukhovich derivatives of the metric projection in Hilbert spaces.
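As a finite-dimensional stand-in for the setting of the paper (an assumption, since the paper works in Hilbert spaces), the sketch below implements the metric projection onto a closed ball in R^n and approximates its one-sided Gâteaux directional derivative by forward differences.

```python
# A hedged finite-dimensional sketch (R^n instead of an infinite-dimensional
# Hilbert space): metric projection onto the closed unit ball and a one-sided
# directional derivative of it approximated by forward differences.
import numpy as np

def project_ball(x, r=1.0):
    """Metric projection onto the closed ball of radius r centred at the origin."""
    norm = np.linalg.norm(x)
    return x if norm <= r else (r / norm) * x

def directional_derivative(P, x, d, t=1e-7):
    """Approximate the one-sided directional derivative P'(x; d)."""
    return (P(x + t * d) - P(x)) / t

x = np.array([2.0, 0.0])                             # a point outside the ball
d = np.array([0.0, 1.0])                             # a tangential direction
print(project_ball(x))                               # [1. 0.]
print(directional_derivative(project_ball, x, d))    # ~ [0.  0.5], i.e. d scaled by r/||x||
```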
Journal Article