INTERNATIONAL JOURNAL OF LATEST TECHNOLOGY IN ENGINEERING, MANAGEMENT & APPLIED SCIENCE (IJLTEMAS)
ISSN 2278-2540 | DOI: 10.51583/IJLTEMAS | Volume XIII, Issue XI, November 2024
On Iterative Methods in Optimization
Henrietta Nkansah¹* & Bismark Kwao Nkansah²
¹ Department of Mathematics, University of Cape Coast, Ghana
² Department of Statistics, University of Cape Coast, Ghana
* Corresponding Author
DOI: https://doi.org/10.51583/IJLTEMAS.2024.1311014
Received: 18 November 2024; Accepted: 30 November 2024; Published: 16 December 2024
Abstract: The paper highlights a failure in the implementation of a recommendation for the modified Newton's method, using as test functions Rosenbrock-type functions that converge slowly and have two minimum points. The study finds that a procedure recommended for the case where the Hessian at a point is not positive definite may not lead to the desired optimal solution, particularly when the initial point is not close enough to the expected solution. We demonstrate how to work around this problem. The results show that more than one technique may be required to determine all critical points of a given function.
Keywords: Optimization, Rosenbrock’s function, Modified Newton’s Method, Descent direction
I. Introduction
In some gradient-based unconstrained optimization techniques, a non-positive definite Hessian matrix of the problem function poses challenges for convergence to the solution. For Newton's method in particular, various modifications have been suggested (e.g., Fiacco & McCormick, 1967; Marquardt, 1963) that incorporate the Steepest Descent (SD) method to obtain a new direction that is more likely to lead to the minimum.
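The following Python/NumPy sketch illustrates one common way of implementing this idea; it is not the authors' exact procedure, and the names search_direction, grad_x and hess_x are illustrative. It attempts the Newton step and falls back to the steepest-descent direction whenever a Cholesky factorization reveals that the Hessian is not positive definite.

import numpy as np

def search_direction(grad_x, hess_x):
    """Return a descent direction at the current point.

    grad_x : gradient vector at the current point
    hess_x : Hessian matrix at the current point
    """
    try:
        # Cholesky factorization succeeds only for (symmetric) positive definite matrices.
        L = np.linalg.cholesky(hess_x)
        # Newton direction: solve H d = -g via the triangular factors.
        y = np.linalg.solve(L, -grad_x)
        return np.linalg.solve(L.T, y)
    except np.linalg.LinAlgError:
        # Hessian not positive definite: fall back to the steepest-descent direction.
        return -grad_x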
The convergence rate of an optimization problem may be influenced by a number of factors, including the role of the underlying optimization method. Others involve the global or local nature of the convergence (Lewis & Nash, 2006). For studies of convergence, a test function of the Rosenbrock type (Emiola & Adem, 2021) comes in handy for assessing the robustness of gradient-based optimization algorithms.
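For reference, the classical two-dimensional Rosenbrock function and its gradient can be written as the short Python/NumPy sketch below; this is the standard form, and the specific variants studied later in the paper may differ.

import numpy as np

def rosenbrock(x, a=1.0, b=100.0):
    # f(x) = (a - x1)^2 + b (x2 - x1^2)^2, with global minimum at (a, a^2)
    return (a - x[0])**2 + b * (x[1] - x[0]**2)**2

def rosenbrock_grad(x, a=1.0, b=100.0):
    # Gradient of the function above, obtained by direct differentiation.
    return np.array([
        -2.0 * (a - x[0]) - 4.0 * b * x[0] * (x[1] - x[0]**2),
        2.0 * b * (x[1] - x[0]**2),
    ])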
In Section 2, we describe the illustrative problem functions and review how the upper bound on the convergence rate, expressed as a function of the level of ill-conditioning, poses a challenge for the optimization process; the standard bound is recalled at the end of this section. It is known that obtaining the desired critical point depends on the initial point. We demonstrate in this paper that, with an appropriate search direction, Newton's method can reach the desired optimal point even from a (reasonably) distant starting point and for a highly ill-conditioned problem function. In Section 3, Newton's method and the Modified Newton's method are presented. In the process, the problem of interest of the study is highlighted, namely the failure of convergence in spite of known recommendations in the literature, and a remedy is prescribed. Throughout the implementations, a fixed tolerance is set. In Section 4, a summary of the proposed procedure is presented, followed by the conclusion.
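For context, a standard result (not derived in this paper) shows how the condition number enters such bounds: for steepest descent with exact line search on a quadratic function with Hessian H,

\[
f(x_{k+1}) - f(x^{*}) \;\le\; \left(\frac{\kappa - 1}{\kappa + 1}\right)^{2}\bigl(f(x_{k}) - f(x^{*})\bigr),
\qquad \kappa = \frac{\lambda_{\max}(H)}{\lambda_{\min}(H)},
\]

so for a highly ill-conditioned function (large \(\kappa\)) the contraction factor approaches one and convergence becomes correspondingly slow.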
II. Illustrative Functions and Effect of Ill-Conditioning on Convergence
Two main types of functions are used in this paper. The functions are selected to illustrate the effect of ill-conditioning on the
determination of optimal points.
Test Function I
We consider the problem of minimizing the function