000033723 001__ 33723
000033723 005__ 20190316233408.0
000033723 0247_ $$2doi$$a10.5075/epfl-thesis-3239
000033723 02470 $$2urn$$aurn:nbn:ch:bel-epfl-thesis3239-5
000033723 02471 $$2nebis$$a4919704
000033723 037__ $$aTHESIS
000033723 041__ $$aeng
000033723 088__ $$a3239
000033723 245__ $$aTrust-region methods based on radial basis functions with application to biomedical imaging
000033723 269__ $$a2005
000033723 260__ $$aLausanne$$bEPFL$$c2005
000033723 300__ $$a132
000033723 336__ $$aTheses
000033723 502__ $$aAndrew Conn, Thomas Liebling, Thomas Mountford, Annick Sartenaer, Michaël Unser
000033723 520__ $$aWe have developed a new derivative-free algorithm based on Radial Basis Functions (RBFs). Derivative-free optimization is an active field of research, and several algorithms have been proposed recently. Such problems are frequent in industrial settings, because in many applications the optimization process involves simulation packages that are treated as black boxes. The development of our own algorithm was originally motivated by an application in biomedical imaging: the medical image registration problem. The particular characteristics of this problem led us to develop a new optimization algorithm based on trust-region methods. However, it is designed to be generic and applicable to a wide range of problems. The main originality of our approach is the use of RBFs to build the models. In particular, we have adapted the existing theory based on quadratic models to our own models and developed new procedures especially designed for models based on RBFs. We have tested our algorithm, called BOOSTERS, against state-of-the-art methods (UOBYQA, NEWUOA, DFO). On the medical image registration problem, BOOSTERS appears to be the method of choice. Tests on problems from the CUTEr collection show that BOOSTERS is comparable to, but not better than, other methods on small problems (size 2-20). It performs very well on medium-size problems (20-80). Moreover, it is able to solve problems of dimension 200, which is considered very large in derivative-free optimization. We have also developed a new class of algorithms combining the robustness of derivative-free algorithms with the faster convergence rate of Newton-like methods. In fact, they define a new class of algorithms lying between derivative-free optimization and quasi-Newton methods. These algorithms are built on the skeleton of our derivative-free algorithm, but they can incorporate the gradient when it is available. They can be interpreted as a way of doping derivative-free algorithms with derivatives. If the derivatives are available at each iteration, then our method can be seen as an alternative to quasi-Newton methods. Conversely, if the derivatives are never evaluated, then the algorithm reduces to BOOSTERS. It is a very interesting alternative to existing methods for problems whose objective function is expensive to evaluate and whose derivatives are not available. In this situation, the gradient can be approximated by finite differences, and its cost corresponds to n additional function evaluations, assuming that R^n is the domain of definition of the objective function. We have compared our method with CFSQP and BTRA, two gradient-based algorithms, and the results show that our doped method performs best. We also present a theoretical analysis of the medical image registration problem based on maximization of mutual information. Most current research in this field concentrates on registration based on nonlinear image transformations, but little attention has been paid to the theoretical properties of the optimization problem. In our analysis, we focus on the continuity and differentiability of the objective function. We show in particular that performing a registration without extending the reference image may lead to discontinuities in the objective function, but we demonstrate that, under some mild assumptions, the function is differentiable almost everywhere. Our analysis is important from an optimization point of view and informs the choice of solver. The usual practice is to use generic optimization packages without worrying about the differentiability of the objective function, but using gradient-based methods on a nondifferentiable objective may result in poor performance or even failure to converge. One objective of this analysis is also to make practitioners aware of these issues and to propose new algorithms of potential interest for their applications.
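The abstract does not specify which radial basis or polynomial tail BOOSTERS uses; the following is a minimal sketch of how an RBF interpolation model of the kind described can be built, assuming a cubic radial basis with a linear polynomial tail (a common choice in the RBF literature; all names and parameters here are illustrative, not taken from the thesis):

```python
import numpy as np

def build_rbf_model(Y, f):
    """Fit m(x) = sum_i lam_i * phi(||x - y_i||) + c^T x + b,
    with the cubic basis phi(r) = r**3 and a linear polynomial tail.
    Y: (m, n) array of interpolation points; f: (m,) objective values.
    Requires the points to be unisolvent for linear polynomials."""
    m, n = Y.shape
    # Pairwise distances between interpolation points
    D = np.linalg.norm(Y[:, None, :] - Y[None, :, :], axis=-1)
    Phi = D ** 3
    P = np.hstack([Y, np.ones((m, 1))])  # linear tail basis [x, 1]
    # Standard RBF saddle-point system: interpolation conditions plus
    # orthogonality of the coefficients lam to the polynomial space
    A = np.block([[Phi, P], [P.T, np.zeros((n + 1, n + 1))]])
    rhs = np.concatenate([f, np.zeros(n + 1)])
    sol = np.linalg.solve(A, rhs)
    lam, coef = sol[:m], sol[m:]

    def model(x):
        r = np.linalg.norm(np.asarray(x) - Y, axis=1)
        return lam @ r ** 3 + coef[:n] @ x + coef[n]

    return model
```

Inside a trust-region iteration, such a model would be minimized over a ball around the current iterate to propose the next trial point; the sketch above only covers the model-building step the abstract highlights.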
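The claim that approximating the gradient by finite differences costs n additional function evaluations on R^n can be illustrated with a forward-difference sketch (the function name, step size, and reuse of a cached value f0 are illustrative assumptions, not details from the thesis):

```python
import numpy as np

def fd_gradient(fun, x, f0=None, h=1e-6):
    """Forward-difference gradient of fun at x in R^n.
    Costs exactly n extra evaluations of fun when f0 = fun(x)
    is already known from the current iterate."""
    x = np.asarray(x, dtype=float)
    if f0 is None:
        f0 = fun(x)
    g = np.empty_like(x)
    for i in range(x.size):
        e = np.zeros_like(x)
        e[i] = h  # perturb one coordinate at a time
        g[i] = (fun(x + e) - f0) / h
    return g
```

This is why such "doped" methods are attractive for expensive black-box objectives: when the true gradient is unavailable, each gradient estimate adds only n function evaluations on top of the evaluations the derivative-free iteration already performs.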
000033723 700__ $$0(EPFLAUTH)106935$$aOeuvray, Rodrigue$$g106935
000033723 720_2 $$0240563$$aBierlaire, Michel$$edir.$$g118332
000033723 8564_ $$s733919$$uhttps://infoscience.epfl.ch/record/33723/files/EPFL_TH3239.pdf$$yTexte intégral / Full text$$zTexte intégral / Full text
000033723 909C0 $$0252123$$pTRANSP-OR$$xU11418
000033723 909C0 $$0252055$$pROSO
000033723 909CO $$ooai:infoscience.tind.io:33723$$pENAC$$pthesis$$pthesis-bn2018$$pDOI$$qDOI2$$qGLOBAL_SET
000033723 918__ $$aSB$$bSB-SMA$$cIMA
000033723 919__ $$aTRANSP-OR
000033723 919__ $$aROSO
000033723 920__ $$a2005-5-19$$b2005
000033723 970__ $$a3239/THESES
000033723 973__ $$aEPFL$$sPUBLISHED
000033723 980__ $$aTHESIS