1) strong Wolfe line search
强Wolfe线性搜索
1. This paper studies a new class of memory gradient methods for solving unconstrained optimization problems; the global convergence of the algorithms under the strong Wolfe line search is proved.
2. The authors present a new class of memory gradient methods for unconstrained optimization problems and prove their global convergence under the strong Wolfe line search.
3. To address the fact that the PRP method can fail to converge for general nonconvex functions under the strong Wolfe line search, a new conjugate gradient algorithm is given; its formula satisfies the sufficient descent condition under the strong Wolfe line search (the conditions themselves are stated below).
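For reference, the strong Wolfe line search cited in these entries selects a step size \alpha_k along a descent direction d_k that satisfies both a sufficient-decrease condition and a strengthened curvature condition; writing g_k = \nabla f(x_k) and taking constants 0 < \delta < \sigma < 1, the conditions are

    f(x_k + \alpha_k d_k) \le f(x_k) + \delta \alpha_k g_k^T d_k,
    |g(x_k + \alpha_k d_k)^T d_k| \le \sigma |g_k^T d_k|.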
2) strong Wolfe line search
强Wolfe线搜索
1. A new class of memory gradient methods is studied: the algorithm takes a linear combination of the negative gradient at the current point and the search direction at the previous point as the search direction and determines the step size by the strong Wolfe line search; global convergence is proved, and the convergence rate is discussed when the objective function is uniformly convex (the general form of this search direction is sketched below).
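As a sketch of the construction described in this entry (the exact weighting of the two terms varies from paper to paper, so this is only the typical form), a memory gradient method generates iterates by

    d_0 = -g_0,    d_k = -g_k + \beta_k d_{k-1}  (k \ge 1),    x_{k+1} = x_k + \alpha_k d_k,

where the scalar \beta_k is chosen by the particular method and the step size \alpha_k is determined by the strong Wolfe line search.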
4) Wolfe line search
Wolfe线搜索
1.
A new conjugate gradient method is proposed for solving unconstrained optimization problems to update and prove the method with Wolfe line search convergece globally.
提出了求解无约束优化问题的一种新的共轭梯度法,修正了βk,并在Wolfe线搜索下证明了它的全局收敛性。
2. A class of three-term hybrid conjugate gradient algorithms for unconstrained optimization is proposed, combining the Hestenes-Stiefel and Dai-Yuan methods; the convergence of the new methods under the Wolfe line search is proved without requiring the descent condition, and numerical experiments show the advantage of the hybrid algorithm over the HS and PRP methods.
3. Under mild conditions, we prove that the method possesses the descent property and is globally convergent with the strong Wolfe line search (a numerical check of both kinds of conditions is sketched below).
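The standard Wolfe line search of this entry keeps the sufficient-decrease inequality above but relaxes the curvature condition to g(x_k + \alpha_k d_k)^T d_k \ge \sigma g_k^T d_k. Below is a minimal sketch of both acceptance tests; the helper name, the quadratic test problem, and the constants c1 and c2 are illustrative assumptions, not values taken from the cited papers.

import numpy as np

def satisfies_wolfe(f, grad, x, d, alpha, c1=1e-4, c2=0.1, strong=True):
    # Illustrative helper: checks whether step size `alpha` along direction `d`
    # satisfies the (strong) Wolfe conditions at the point `x`.
    gTd = grad(x) @ d                      # directional derivative at x; negative for a descent direction
    g_new_d = grad(x + alpha * d) @ d      # directional derivative at the trial point
    sufficient_decrease = f(x + alpha * d) <= f(x) + c1 * alpha * gTd
    if strong:
        curvature = abs(g_new_d) <= c2 * abs(gTd)   # strong Wolfe curvature test
    else:
        curvature = g_new_d >= c2 * gTd             # standard Wolfe curvature test
    return sufficient_decrease and curvature

# Quadratic test problem f(x) = 0.5 * ||x||^2 with gradient g(x) = x.
f = lambda x: 0.5 * float(x @ x)
grad = lambda x: x
x0 = np.array([1.0, -2.0])
d0 = -grad(x0)                             # steepest-descent direction

print(satisfies_wolfe(f, grad, x0, d0, alpha=1.0))                 # True: exact minimizer along d0
print(satisfies_wolfe(f, grad, x0, d0, alpha=1.2))                 # False: strong curvature test rejects the overshoot
print(satisfies_wolfe(f, grad, x0, d0, alpha=1.2, strong=False))   # True: standard Wolfe still accepts it

In a full line search the step size would be located by a bracketing and interpolation (zoom) procedure rather than by testing fixed trial values; the sketch only shows the acceptance test that the entries refer to.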
5) Wolfe-Powell line search
Wolfe-Powell线性搜索
6) Wolfe-Powell line search
Wolfe-Powell型线性搜索