Let H be a real Hilbert space and let $\langle \cdot,\cdot \rangle$ denote the corresponding scalar product. Given a $\mathcal{C}^2$ function $\Phi : H \to \mathbb{R}$ that is bounded from below, we consider the following dynamical system: $$(\mathrm{SDC})\qquad \dot x(t) + \lambda(x(t))\,\nabla\Phi(x(t)) = 0,\quad t \geqslant 0,$$ where λ(x) comes from a quadratic approximation of a line search in the direction −∇Φ(x). The term λ(x) is intimately connected with the normal curvature radius ρ(x) in the direction ∇Φ(x). The remarkable property of (SDC) is that the gradient norm |∇Φ(x(t))| decreases exponentially to zero as t→+∞.
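As a minimal numerical sketch (an assumption, not the authors' formulation): if λ(x) is taken to be the exact minimizer of the quadratic model of Φ along −∇Φ(x), then λ(x) = |∇Φ(x)|² / ⟨∇²Φ(x)∇Φ(x), ∇Φ(x)⟩, and an explicit Euler discretization of (SDC) on a quadratic test function looks as follows. The matrix `A` and the step count are illustrative choices.

```python
import numpy as np

# Test problem: Phi(x) = 0.5 x^T A x with A symmetric positive definite,
# so grad Phi(x) = A x and Hess Phi(x) = A.
A = np.array([[3.0, 1.0], [1.0, 2.0]])

def grad(x):
    return A @ x

x = np.array([1.0, -2.0])
for _ in range(50):
    g = grad(x)
    if g @ g == 0.0:
        break  # already at a critical point
    # lambda(x) = |g|^2 / <Hess(x) g, g>: minimizer of the quadratic
    # model t -> Phi(x - t g) along the steepest descent direction.
    lam = (g @ g) / (g @ (A @ g))
    x = x - lam * g  # explicit Euler step of (SDC)

print(np.linalg.norm(grad(x)))  # gradient norm driven toward zero
```

Consistent with the exponential decay stated above, the gradient norm contracts geometrically along the iterates on this strongly convex test function.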
When Φ is a convex function that is nonsmooth or lacks strong convexity, we consider a parametric family {Φε, ε>0} of smooth, strongly convex approximations of Φ and couple this approximation scheme with the (SDC) system. More precisely, we are interested in the following dynamical system: $$(\mathrm{ASDC})\qquad \dot x(t) + \lambda(t,x(t))\,\nabla_x \Phi(t,x(t)) = 0,\quad t \geqslant 0,$$ where λ(t, x) is a time-dependent function involving a curvature term. We give conditions on the approximating family and on ε(⋅) that ensure the asymptotic convergence of the solution trajectories x(⋅) toward a particular solution of the problem min{Φ(x) : x∈H}. Applications to barrier and penalty methods in linear programming and to viscosity methods are given.
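A hedged illustration of the coupling idea (the regularization, the schedule ε(k) = 1/k, and the step rule below are assumptions chosen for the sketch, not the paper's conditions): take the Tikhonov viscosity approximation Φε(x) = Φ(x) + (ε/2)|x|², which is smooth and strongly convex even when Φ is not, and let ε vanish along the iterations. For Φ(x) = x₁², whose solution set is the whole x₂-axis, the viscosity trajectory is expected to select the minimum-norm solution (0, 0).

```python
import numpy as np

def grad_phi_eps(x, eps):
    # grad of Phi(x) = x1^2 plus grad of the Tikhonov term (eps/2)|x|^2
    return np.array([2.0 * x[0], 0.0]) + eps * x

def hess_phi_eps(eps):
    return np.diag([2.0 + eps, eps])  # Hessian of Phi_eps

x = np.array([1.0, 1.0])
for k in range(1, 2001):
    eps = 1.0 / k  # slowly vanishing regularization parameter
    g = grad_phi_eps(x, eps)
    if g @ g == 0.0:
        break
    # curvature-based step length, as in the (SDC) scheme, now time-dependent
    lam = (g @ g) / (g @ (hess_phi_eps(eps) @ g))
    x = x - lam * g

print(x)  # drifts toward the minimum-norm solution (0, 0)
```

The design point is the coupling: if ε(⋅) decays too fast, the strong convexity is lost before the trajectory has stabilized; a slow enough schedule lets the iterates track the minimizers of Φε and converge to a distinguished solution of min Φ.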