Journal of Optimization Theory and Applications | 2020-05-09 | Pages 1–21
Diagonal Approximation of the Hessian by Finite Differences for Unconstrained Optimization
A new quasi-Newton method with a diagonal updating matrix is proposed, in which the diagonal elements are determined by forward or by central finite differences.
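For intuition, the diagonal of the Hessian can be approximated from function values alone with central finite differences, H_ii ≈ (f(x + h e_i) − 2 f(x) + f(x − h e_i)) / h². The sketch below illustrates only this basic idea; the function name and fixed step size `h` are illustrative assumptions, and the paper's safeguards and update rules are not reproduced here.

```python
import numpy as np

def diag_hessian_central(f, x, h=1e-4):
    """Approximate the diagonal of the Hessian of f at x by central
    finite differences: H_ii ~ (f(x+h*e_i) - 2*f(x) + f(x-h*e_i)) / h^2.

    Illustrative sketch only: a fixed step h is used, with no
    safeguarding against a non-positive or near-zero diagonal.
    """
    n = x.size
    fx = f(x)
    diag = np.empty(n)
    for i in range(n):
        e = np.zeros(n)
        e[i] = h  # perturb only the i-th coordinate
        diag[i] = (f(x + e) - 2.0 * fx + f(x - e)) / h**2
    return diag

# With a diagonal approximation B = diag(d), the quasi-Newton search
# direction -B^{-1} g reduces to an elementwise division -g / d.
```

For a convex quadratic such as f(x) = Σ x_i², the true Hessian diagonal is 2 in every coordinate, which the central-difference formula recovers up to rounding error.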
Andrei, N.: Diagonal Approximation of the Hessian by Finite Differences for Unconstrained Optimization. Journal of Optimization Theory and Applications, pp. 1–21 (2020)