In this paper, we review current matrix completion theory and propose a new framework, Feature Vector and Function Approximation Based Matrix Completion (FVFABMC), which extends low-rank matrix completion theory. The new matrix completion problem decomposes into two learning problems: learning the feature vectors, and learning the synthetic function defined on the feature vector matrix. Assuming only that the synthetic function is locally smooth, a first-order approximation turns feature vector learning into a convex semidefinite program, so a globally optimal solution for the feature vectors can be obtained. To solve the feature vector learning problem at large scale, we also propose a stochastic parallel block gradient descent algorithm. For the synthetic function learning problem, the local linearity hypothesis allows the problem to be formalized as an unconstrained least-squares problem over local neighboring coefficients, which avoids the difficulties of model selection and parameter learning. Numerical experiments demonstrate the feasibility of the FVFABMC method in learning feature vectors and show good prediction performance on the missing entries of the utility matrix.
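To make the synthetic function learning step concrete, the following is a minimal sketch of one way the local linear hypothesis can be realized: a missing entry's feature vector is approximated as a linear combination of neighboring feature vectors whose entries are observed, the combination coefficients are found by unconstrained least squares, and the same coefficients are applied to the neighbors' observed values. The function name, data shapes, and the toy data are illustrative assumptions, not the paper's exact formulation.

```python
import numpy as np

def local_linear_predict(u, neighbors, neighbor_values):
    """Predict a missing entry via local linear least squares.

    u               : (d,) feature vector of the target.
    neighbors       : (k, d) feature vectors of k neighbors with observed entries.
    neighbor_values : (k,) the neighbors' observed utility values.
    """
    # Unconstrained least squares for the local coefficients w:
    #   min_w || u - neighbors.T @ w ||^2
    w, *_ = np.linalg.lstsq(neighbors.T, u, rcond=None)
    # Apply the same local coefficients to the observed values.
    return float(neighbor_values @ w)

# Toy usage: place the target exactly in the span of its neighbors,
# so the recovered coefficients reproduce the true combination.
rng = np.random.default_rng(0)
neighbors = rng.normal(size=(5, 8))          # 5 neighbors in an 8-dim feature space
w_true = np.array([0.2, 0.3, 0.1, 0.25, 0.15])
u = neighbors.T @ w_true                     # target feature vector
values = rng.normal(size=5)                  # neighbors' observed entries
pred = local_linear_predict(u, neighbors, values)
```

Because the least-squares problem is unconstrained, no regularization hyperparameter or model family needs to be chosen, which is the sense in which this step sidesteps model selection.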