Volume 53, Number 2, April-June 2019
Page(s): 657 - 666
Published online: 13 June 2019
Rapidly convergent Steffensen-based methods for unconstrained optimization
Department of Mathematics, Tafresh University, Tafresh, Iran
* Corresponding author: email@example.com
Accepted: 23 May 2017
A drawback of rapidly convergent methods for unconstrained optimization, such as Newton's method, is the computational cost arising especially from the second derivative. In this paper, a class of methods for solving unconstrained optimization problems is proposed that implicitly applies approximations to derivatives. This class of methods is based on a modified Steffensen method for finding roots of a function and builds a quadratic model of the objective without using the second derivative. Two computationally inexpensive methods of this kind are proposed which use only the first derivative of the function. Derivative-free versions of these methods are also suggested for cases where gradient formulas are unavailable or difficult to evaluate. Both the theory and numerical experiments confirm the rapid convergence of this class of methods.
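For background, the classic Steffensen iteration that the paper builds on replaces the derivative in Newton's root-finding step with a divided difference computed from an extra function evaluation, retaining quadratic convergence without any derivative. The sketch below shows only this standard iteration for a scalar root-finding problem; it is not the authors' modified variant, and its extension to optimization (e.g. applying it to the gradient) is not reproduced here.

```python
def steffensen(f, x0, tol=1e-12, max_iter=50):
    """Classic Steffensen root-finding iteration (illustrative sketch).

    Uses x_{k+1} = x_k - f(x_k)^2 / (f(x_k + f(x_k)) - f(x_k)),
    i.e. Newton's step with f'(x_k) approximated by the divided
    difference (f(x_k + f(x_k)) - f(x_k)) / f(x_k).
    """
    x = x0
    for _ in range(max_iter):
        fx = f(x)
        if abs(fx) < tol:
            return x
        denom = f(x + fx) - fx  # derivative-free slope estimate
        if denom == 0:
            break  # avoid division by zero; iteration has stalled
        x = x - fx * fx / denom
    return x

# Example: root of f(x) = x^2 - 2, i.e. sqrt(2)
root = steffensen(lambda x: x * x - 2, 1.5)
```

Like Newton's method, each step uses two evaluations of `f` but no derivative, which is the property the paper exploits when constructing its quadratic model.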
Mathematics Subject Classification: 65K10 / 90C53
Key words: Unconstrained optimization / derivative-free / Newton's method / Steffensen's method
© EDP Sciences, ROADEF, SMAI 2019