Optimality Conditions for Nonsmooth Optimization and Mordukhovich Subdifferentials

Date

2012-01

Publisher

Addis Ababa University

Abstract

Differentiability assumptions play a vital role in nonlinear programming, because most methods for locating an optimum in nonlinear programming begin by computing the gradient of the function and then finding its stationary points. For unconstrained optimization problems with a differentiable objective, checking the positive definiteness of the Hessian matrix at the stationary points determines whether those points are optimal. Similarly, if the objective function and the constraint functions are differentiable, the well-known Karush-Kuhn-Tucker (KKT) optimality conditions lead to the optimum point(s) of the given optimization problem. However, since the gradient of a non-differentiable function does not exist, we treat such problems by finding the subgradient, the directional derivative, or the Mordukhovich normal cone, depending on the convexity of the function. Consequently, the optimization procedure for problems whose functions are not differentiable differs from the procedure for problems in which the objective function and the constraint functions are differentiable. This project focuses on deriving optimality conditions for optimization problems without any differentiability assumptions. The subgradient and directional-derivative approach is used to solve nonsmooth optimization problems of convex type, and the Mordukhovich extremal principle is applied to solve nonsmooth optimization problems of nonconvex type.
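
To illustrate the convex nonsmooth case mentioned above, a minimal worked example in LaTeX is sketched below, using the standard absolute-value function (an illustrative textbook choice, not drawn from the project itself). For a convex function f, the Fermat-type rule states that a point is a global minimizer if and only if zero belongs to its subdifferential there:

% Illustrative worked example; f(x) = |x| is a standard textbook function,
% not taken from the project.
\[
  f(x) = |x|, \qquad
  \partial f(x) =
  \begin{cases}
    \{-1\},   & x < 0, \\
    [-1,\,1], & x = 0, \\
    \{+1\},   & x > 0.
  \end{cases}
\]
% Since 0 \in \partial f(0) = [-1, 1], the point \bar{x} = 0 is the global
% minimizer, even though the ordinary derivative f'(0) does not exist.

Here the subdifferential plays the role that the gradient plays in the smooth theory; at the kink x = 0 it is a set of slopes rather than a single vector, which is why the condition 0 ∈ ∂f(x̄) replaces the classical equation ∇f(x̄) = 0.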

Keywords

Optimality Conditions, Nonsmooth Optimization
