Talk: New Gradient and Hessian Approximation Methods for Derivative-free Optimisation
Abstract: In general, derivative-free optimisation (DFO) uses approximations of first- and second-order information in minimisation algorithms. DFO appears in direct-search, model-based, trust-region and other mainstream optimisation techniques, and has gained popularity in recent years. This work discusses previous results on two particular uses of DFO: the proximal bundle method and the VU-algorithm, and then presents improvements made this year to gradient and Hessian approximation techniques. These improvements can be inserted into any routine that requires such estimates.
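As a minimal illustration of the kind of first- and second-order approximations the abstract refers to (a standard central-difference sketch, not the speaker's new methods; the helper names `fd_gradient` and `fd_hessian` are hypothetical):

```python
import numpy as np

def fd_gradient(f, x, h=1e-6):
    """Central-difference estimate of the gradient of f at x."""
    x = np.asarray(x, dtype=float)
    g = np.zeros_like(x)
    for i in range(x.size):
        e = np.zeros_like(x)
        e[i] = h
        # (f(x + h e_i) - f(x - h e_i)) / (2h) approximates df/dx_i
        g[i] = (f(x + e) - f(x - e)) / (2 * h)
    return g

def fd_hessian(f, x, h=1e-4):
    """Central-difference estimate of the Hessian of f at x."""
    x = np.asarray(x, dtype=float)
    n = x.size
    H = np.zeros((n, n))
    for i in range(n):
        for j in range(n):
            ei = np.zeros(n); ei[i] = h
            ej = np.zeros(n); ej[j] = h
            # Second-order central difference for d^2 f / (dx_i dx_j)
            H[i, j] = (f(x + ei + ej) - f(x + ei - ej)
                       - f(x - ei + ej) + f(x - ei - ej)) / (4 * h * h)
    return H
```

Estimates of this type (or more sophisticated simplex-gradient variants) are what model-based and bundle-type DFO routines consume in place of exact derivatives.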
Please forward this information to anyone you think could be interested in this seminar series.
Hoa Bui (Curtin), Minh Dao (Federation Univ.), Alex Kruger (Federation Univ.), Guoyin Li (Univ. NSW), Vera Roshchina (Univ. NSW), Matthew Tam (Univ. Melbourne)