Conjugate Gradient Method and its Extensions

dc.contributor.advisor: Guta, Berhanu (PhD)
dc.contributor.author: Tibebu, Zewdie
dc.date.accessioned: 2019-04-23T11:55:08Z
dc.date.accessioned: 2023-11-18T12:44:19Z
dc.date.available: 2019-04-23T11:55:08Z
dc.date.available: 2023-11-18T12:44:19Z
dc.date.issued: 2018-05-01
dc.description.abstract: In this project we investigate the conjugate gradient method and its extensions for solving unconstrained minimization problems. The method comes in two important forms: the linear conjugate gradient method for solving systems of linear equations, and the nonlinear conjugate gradient method for nonlinear optimization problems. The performance of the linear conjugate gradient method is tied to the distribution of the eigenvalues of the coefficient matrix. The nonlinear conjugate gradient method is used for solving large-scale nonlinear optimization problems and has wide applications in many fields. We also discuss how to use these results to obtain the convergence of the well-known Fletcher-Reeves and Polak-Ribière conjugate gradient methods, and we compare the algorithms of the steepest descent method, Newton's method, and the conjugate gradient method on quadratic and nonquadratic problems.
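The following is a minimal sketch, not code from the thesis itself, of the two method families named in the abstract: the linear conjugate gradient method for A x = b with a symmetric positive definite matrix A, and a nonlinear conjugate gradient method with the Fletcher-Reeves and Polak-Ribière choices of beta. The function names, the backtracking line search, and the safeguards are illustrative assumptions, not the thesis' algorithms.

```python
import numpy as np


def linear_cg(A, b, x0, tol=1e-10, max_iter=None):
    """Solve A x = b for symmetric positive definite A by linear CG."""
    x = x0.astype(float).copy()
    r = b - A @ x                # residual = negative gradient of 0.5 x'Ax - b'x
    p = r.copy()                 # first direction is the steepest descent direction
    rs_old = r @ r
    for _ in range(max_iter or len(b)):
        Ap = A @ p
        alpha = rs_old / (p @ Ap)      # exact step length for the quadratic
        x += alpha * p
        r -= alpha * Ap
        rs_new = r @ r
        if np.sqrt(rs_new) < tol:
            break
        p = r + (rs_new / rs_old) * p  # conjugate direction update
        rs_old = rs_new
    return x


def nonlinear_cg(f, grad, x0, variant="FR", tol=1e-6, max_iter=500):
    """Nonlinear CG with Fletcher-Reeves ("FR") or Polak-Ribiere ("PR") beta."""
    x = x0.astype(float).copy()
    g = grad(x)
    d = -g
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        if g @ d >= 0:           # safeguard: restart with steepest descent
            d = -g
        # Backtracking (Armijo) line search -- a simple stand-in for the
        # strong Wolfe line search usually assumed in convergence analyses.
        alpha, c, rho = 1.0, 1e-4, 0.5
        while f(x + alpha * d) > f(x) + c * alpha * (g @ d):
            alpha *= rho
        x_new = x + alpha * d
        g_new = grad(x_new)
        if variant == "FR":
            beta = (g_new @ g_new) / (g @ g)
        else:  # "PR", with a reset to steepest descent when beta would be negative
            beta = max(0.0, (g_new @ (g_new - g)) / (g @ g))
        d = -g_new + beta * d
        x, g = x_new, g_new
    return x
```

In exact arithmetic, linear_cg terminates in at most n steps for an n-by-n system, and clustering of the coefficient matrix's eigenvalues speeds convergence further, which is the behaviour the abstract refers to.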
dc.identifier.uri: http://etd.aau.edu.et/handle/12345678/18128
dc.language.iso: en
dc.publisher: Addis Ababa University
dc.subject: Preliminaries
dc.subject: Rate of Convergence
dc.subject: Krylov Subspaces
dc.title: Conjugate Gradient Method and its Extensions
dc.type: Thesis

Files

Original bundle
Name: Zewdie Tibebu 2018.pdf
Size: 769.99 KB
Format: Adobe Portable Document Format