
Browsing by Author "Tafa, Meta"

Now showing 1 - 1 of 1
    Item
    Optimality Condition for Smooth Constrained Optimization Problems
    (Addis Ababa University, 2020-08-30) Tafa, Meta; Guta, Berhanu (PhD)
Mathematical optimization is the process of maximizing or minimizing an objective function by finding the best available values over a set of inputs within a restricted domain. This project focuses on finding optimality conditions for optimization problems with differentiable functions. For unconstrained optimization problems with a differentiable objective function, checking the positive definiteness of the Hessian matrix at stationary points allows one to conclude whether those stationary points are optimum points. For constrained optimization problems in which the objective function and the constraint functions are differentiable, the well-known optimality condition called the Karush-Kuhn-Tucker (KKT) condition leads to the optimum point(s) of the given problem; however, the conventional Lagrangian approach to solving constrained optimization problems yields optimality conditions that are either necessary or sufficient, but not both, unless the objective function and the constraint functions are also convex. The Tchebyshev norm leads to an optimality condition that is both necessary and sufficient without any convexity assumption. This optimality condition can be used to devise a conceptually simple method for solving non-convex inequality-constrained optimization problems.
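The second-order test mentioned in the abstract can be sketched on a toy objective (the function f(x) = x₀² + 2x₁² and all names below are our own illustration, not from the thesis): at a stationary point of a twice-differentiable function, a positive definite Hessian certifies a strict local minimum.

```python
import numpy as np

def hessian_f(x):
    # Hessian of the illustrative objective f(x) = x[0]**2 + 2*x[1]**2
    # (constant here because f is quadratic)
    return np.array([[2.0, 0.0], [0.0, 4.0]])

stationary_point = np.zeros(2)  # the gradient (2*x0, 4*x1) vanishes at the origin
eigenvalues = np.linalg.eigvalsh(hessian_f(stationary_point))
# All eigenvalues positive  =>  Hessian positive definite  =>  strict local minimum
is_local_min = bool(np.all(eigenvalues > 0))
print(is_local_min)  # → True
```

Checking eigenvalues of the symmetric Hessian is one standard way to test positive definiteness; a Cholesky factorization attempt would serve equally well.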


Addis Ababa University © 2023