Abstract

Mixing-length models are often used by engineers to account for turbulence phenomena in a flow. Models of this kind are obtained by adding a turbulent viscosity to the laminar one in the Navier-Stokes equations. When the flow is confined between two close walls, von Karman's model adds a viscosity that depends on the rate of strain multiplied by the square of the distance to the wall. In this short paper, we present a mathematical analysis of such modeling. In particular, we explain why von Karman's model is numerically ill-conditioned when a finite element method is used with a small laminar viscosity. Details of the analysis can be found in [1], [2].
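
For orientation, a minimal sketch of the mixing-length closure described above is given below, assuming standard notation (kappa for the von Karman constant, d(x) for the distance to the wall, nu for the laminar viscosity); the exact formulation used in [1], [2] may differ.

\[
-\operatorname{div}\Big( 2\big(\nu + (\kappa\, d)^{2}\,|D(u)|\big)\, D(u) \Big) + (u\cdot\nabla)u + \nabla p = f,
\qquad \operatorname{div} u = 0,
\]

where \(D(u) = \tfrac{1}{2}\big(\nabla u + \nabla u^{T}\big)\) is the rate-of-strain tensor. Loosely speaking, when \(\nu\) is small the effective viscosity degenerates wherever \(d\) or \(|D(u)|\) vanishes, which is one plausible reading of the ill-conditioning mentioned above.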
