We consider parameter optimization problems constrained by parametrized partial differential equations. Discretizing such a problem can lead to a large-scale optimization problem that is too expensive to solve rapidly. To accelerate the parameter optimization, we use a reduced basis surrogate model for the numerical optimization. Many optimization methods require sensitivity information about the cost functional. In the following we show that this derivative information can be computed efficiently in the reduced basis framework for general linear output functionals and parametrized evolution problems with linear, parameter-separable operators. By computing the sensitivities directly, instead of applying the more widely used adjoint approach, we can rapidly optimize different cost functionals using the same reduced basis model. Furthermore, we derive rigorous a posteriori error estimators for the solution, the gradient, and the optimal parameters, all of which can be computed online. The method is applied to two parameter optimization problems with an underlying advection-diffusion equation.
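The direct sensitivity approach referred to above can be sketched in generic notation (the symbols below are illustrative assumptions, not necessarily the paper's own). For a semi-discrete evolution problem with a parameter-separable operator, differentiating the state equation with respect to a parameter component yields a linear evolution equation for the sensitivity with the same operator:

```latex
% State equation with parameter-separable operator (assumed notation):
%   d/dt u(t;\mu) = A(\mu) u(t;\mu) + b(\mu),
%   A(\mu) = \sum_{q=1}^{Q_A} \theta_q^A(\mu) A_q .
%
% Differentiating with respect to \mu_i gives the sensitivity equation:
%   d/dt \partial_{\mu_i} u = A(\mu)\, \partial_{\mu_i} u
%                             + \big( \partial_{\mu_i} A(\mu) \big) u
%                             + \partial_{\mu_i} b(\mu),
% where, by separability,
%   \partial_{\mu_i} A(\mu) = \sum_{q=1}^{Q_A} \partial_{\mu_i} \theta_q^A(\mu)\, A_q ,
% so only the scalar coefficient derivatives \partial_{\mu_i}\theta_q^A(\mu)
% must be evaluated online; the matrices A_q are precomputed.
%
% For a linear output s(\mu) = \ell( u(\mu) ), the gradient follows directly:
%   \partial_{\mu_i} s(\mu) = \ell( \partial_{\mu_i} u(\mu) ).
```

Because the sensitivity equation reuses the operator $A(\mu)$, the same reduced basis model can serve both the state and the sensitivities, which is what makes the direct approach attractive when several cost functionals built on the same state are optimized.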