Consider an optimization problem where the objective function is an integral containing the solution of a system of ordinary differential equations. Suppose that efficient optimization methods are available, as well as efficient methods for initial value problems for ordinary differential equations. The main purpose of this paper is to show how these methods can be applied efficiently to the considered problem. First, general procedures for the evaluation of gradients and Hessian matrices are described. Furthermore, a new efficient Gauss-Newton-like approximation of the Hessian matrix is derived for the special case in which the objective function is an integral of squares. This approximation is used to derive a Gauss-Newton-like trust region method, for which global and superlinear convergence properties are proved. Finally, several optimization methods are proposed, and computational experiments illustrating their efficiency are reported.
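To fix ideas, the following is a minimal sketch of the problem class described above; the symbols $F$, $f$, $g$, $r$, $y_0$, and the interval $[t_0, t_1]$ are notation assumed for this summary, not necessarily the paper's own.

\begin{align*}
  % Assumed problem class: minimize an integral objective whose
  % integrand depends on the solution y(t; x) of a parametrized
  % initial value problem.
  \min_{x \in \mathbb{R}^n} \quad
    & F(x) = \int_{t_0}^{t_1} f\bigl(x, t, y(t; x)\bigr)\,\mathrm{d}t, \\
  \text{subject to} \quad
    & \frac{\mathrm{d}y}{\mathrm{d}t} = g\bigl(x, t, y\bigr), \qquad
      y(t_0; x) = y_0(x).
\end{align*}

In the integral-of-squares case, the integrand is a squared residual, and a Gauss-Newton-like approximation of the Hessian drops the second-order residual term, keeping only the first-derivative product (here $\nabla_x r$ denotes the total derivative of the residual with respect to $x$, including the sensitivity of $y$ to $x$):

\begin{align*}
  % Exact Hessian: \int (\nabla_x r)(\nabla_x r)^T + r \nabla_x^2 r \, dt;
  % the Gauss-Newton-like approximation discards the r \nabla_x^2 r term.
  F(x) &= \tfrac{1}{2} \int_{t_0}^{t_1} r\bigl(x, t, y(t; x)\bigr)^2 \,\mathrm{d}t, \\
  \nabla^2 F(x) &\approx \int_{t_0}^{t_1}
      \nabla_x r \,\bigl(\nabla_x r\bigr)^{\mathsf T} \,\mathrm{d}t.
\end{align*}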
Mathematics Subject Classification: 49M15, 90C30, 65K10, 49J15, 90C48