The generalization performance of a support vector machine (SVM) depends largely on the choice of kernel function, kernel parameters, and penalty factor; making these choices is known as SVM model selection. When differentiable but loose generalization bounds are used as objective functions, traditional optimization algorithms easily become trapped in local optima, while modern heuristic techniques struggle to find truly optimal solutions. Recently, the empirical error on a validation set has been adopted as an alternative objective function and optimized with classical methods. In this paper, we propose a new approach to SVM model selection based on a hybrid genetic algorithm and the empirical error minimization criterion. The hybrid genetic algorithm integrates gradient descent into the genetic search to find better RBF kernel parameters. Experiments on 13 benchmark datasets demonstrate that our method works well on real applications.
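As a rough illustration of the hybrid strategy described above, the sketch below minimizes a hypothetical smooth stand-in for the validation error over two model-selection variables (playing the roles of the log RBF kernel width and log penalty factor): a genetic algorithm explores the parameter space globally, and a finite-difference gradient-descent step refines the current elite each generation. The objective `validation_error`, the population sizes, and the learning rate are all illustrative assumptions, not the paper's actual settings; in the paper's method the objective would be the empirical error of an RBF-kernel SVM on a validation set.

```python
import random

def validation_error(p):
    # Hypothetical smooth stand-in for the empirical validation error
    # over (log kernel width, log penalty factor); minimum at (1, 2).
    g, c = p
    return (g - 1.0) ** 2 + 0.5 * (c - 2.0) ** 2

def grad(f, p, eps=1e-5):
    # Central finite-difference gradient, standing in for the analytic
    # gradient of the validation error w.r.t. the kernel parameter.
    return [
        (f([p[i] + eps * (i == j) for i in range(len(p))])
         - f([p[i] - eps * (i == j) for i in range(len(p))])) / (2 * eps)
        for j in range(len(p))
    ]

def hybrid_ga(f, pop_size=20, gens=30, lr=0.1, seed=0):
    rng = random.Random(seed)
    pop = [[rng.uniform(-3.0, 3.0), rng.uniform(-3.0, 3.0)]
           for _ in range(pop_size)]
    for _ in range(gens):
        pop.sort(key=f)
        # Gradient-descent refinement of the elite: the "hybrid" step
        # that sharpens the genetic algorithm's global search.
        elite = pop[0]
        g = grad(f, elite)
        pop[0] = [elite[i] - lr * g[i] for i in range(len(elite))]
        # Keep the top half, refill with averaged-crossover children
        # plus Gaussian mutation for diversity.
        parents = pop[: pop_size // 2]
        children = []
        while len(parents) + len(children) < pop_size:
            a, b = rng.sample(parents, 2)
            children.append([(a[i] + b[i]) / 2 + rng.gauss(0, 0.1)
                             for i in range(2)])
        pop = parents + children
    return min(pop, key=f)

best = hybrid_ga(validation_error)
print(best)  # should land near the optimum (1, 2)
```

The gradient step accelerates local convergence near a basin the genetic search has found, which is the usual motivation for hybridizing the two methods.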