Time-lag in Derivative Convergence for Fixed Point Iterations
In an earlier study it was proven, and experimentally confirmed on a 2D Euler code, that fixed point iterations can be differentiated to yield first and second order derivatives of implicit functions defined by state equations. It was also asserted that the resulting approximations to reduced gradients and Hessians converge with the same R-factor as the underlying fixed point iteration. A closer look now reveals that these derivative values nevertheless lag behind the function values, in that the ratios of the corresponding errors grow towards infinity, proportionally to the iteration counter or its square. Mathematically, this rather subtle effect is caused by the occurrence of nontrivial Jordan blocks associated with degenerate eigenvalues. We elaborate the theory and report its confirmation through numerical experiments.
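The lag described above can be observed already in a scalar model problem. The following sketch (my own illustration, not taken from the paper; the contraction F and all constants are chosen for demonstration only) propagates a derivative alongside a linearly convergent fixed point iteration x_{k+1} = F(x_k, p), the so-called piggyback approach, and records the ratio of the derivative error to the function error, which grows roughly linearly in the iteration counter:

```python
import math

# A contractive fixed point map F(x, p) with parameter p,
# chosen purely for illustration.
def F(x, p):    return 0.9 * math.sin(x) + p
def F_x(x, p):  return 0.9 * math.cos(x)   # partial derivative w.r.t. x
def F_p(x, p):  return 1.0                 # partial derivative w.r.t. p

p = 0.1

# Reference fixed point x* (iterate to machine precision) and the exact
# derivative dx*/dp from the implicit function theorem:
#     d* = F_p(x*, p) / (1 - F_x(x*, p)).
x_star = 0.0
for _ in range(200):
    x_star = F(x_star, p)
d_star = F_p(x_star, p) / (1.0 - F_x(x_star, p))

# Piggyback iteration: propagate the state x_k and its derivative
# d_k = dx_k/dp simultaneously through the same fixed point loop.
x, d = 0.0, 0.0
ratios = []
for k in range(50):
    x, d = F(x, p), F_x(x, p) * d + F_p(x, p)
    ratios.append(abs(d - d_star) / abs(x - x_star))

# Both errors decay with the same R-factor, but their ratio grows
# (roughly linearly in k): the derivative lags behind the function.
print(ratios[9], ratios[29], ratios[49])
```

The growth of the ratio reflects the triangular (Jordan-block-like) coupling in the joint error recursion: the derivative error is driven by a forcing term proportional to the state error, which decays at the same rate as the homogeneous part.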
Record created on 2011-05-05, modified on 2016-08-09