In this paper, we propose two modified conjugate gradient methods, both of which produce a sufficient descent direction at every iteration. The theoretical analysis shows that the algorithms are globally convergent under suitable conditions. Numerical results show that both algorithms are efficient on the given test problems from the Matlab library.
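For reference only (the paper's specific modifications are not reproduced here), the classical nonlinear conjugate gradient framework and the sufficient descent condition referred to above are usually stated as
\[
x_{k+1} = x_k + \alpha_k d_k, \qquad d_0 = -g_0, \qquad d_{k+1} = -g_{k+1} + \beta_k d_k,
\]
where $g_k = \nabla f(x_k)$, $\alpha_k > 0$ is the step length, and the scalar $\beta_k$ distinguishes the individual methods. A direction $d_k$ is said to satisfy the sufficient descent condition if
\[
g_k^{T} d_k \le -c\,\|g_k\|^{2} \quad \text{for all } k,
\]
for some constant $c > 0$ independent of $k$.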
The memory gradient method is used for unconstrained optimization, especially large-scale problems. In this paper, we develop a nonmonotone memory gradient method for unconstrained optimization, in which a class of memory gradient directions is combined efficiently. Global and R-linear convergence are obtained by using a nonmonotone line search strategy, and numerical tests are also given to show...
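As background, and without reproducing the paper's exact direction and line search parameters, one common form of a memory gradient direction combines the current negative gradient with a fixed number of previous search directions,
\[
d_k = -g_k + \sum_{i=1}^{m} \beta_{k,i}\, d_{k-i},
\]
and a standard nonmonotone line search accepts any step length $\alpha_k$ satisfying
\[
f(x_k + \alpha_k d_k) \le \max_{0 \le j \le m(k)} f(x_{k-j}) + \delta\,\alpha_k\, g_k^{T} d_k,
\]
where $\delta \in (0,1)$ and $m(k)$ bounds how many recent function values are taken into account; taking the maximum over recent values, rather than $f(x_k)$ alone, is what makes the search nonmonotone.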
A modified DY conjugate gradient algorithm with sufficient descent is proposed in this paper. Furthermore, under the Wolfe line search conditions, we prove that the new algorithm is globally convergent. Initial numerical experiments show that the new algorithm is efficient.
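For context, the original Dai-Yuan (DY) choice of the conjugate gradient parameter and the Wolfe line search conditions mentioned above are, in standard notation,
\[
\beta_k^{DY} = \frac{\|g_{k+1}\|^{2}}{d_k^{T}\,(g_{k+1} - g_k)},
\]
\[
f(x_k + \alpha_k d_k) \le f(x_k) + \rho\,\alpha_k\, g_k^{T} d_k, \qquad
g(x_k + \alpha_k d_k)^{T} d_k \ge \sigma\, g_k^{T} d_k,
\]
with $0 < \rho < \sigma < 1$; the modification proposed in the paper alters this scheme so that every search direction is a sufficient descent direction, but its precise form is not reproduced here.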