A numerical method for the rigorous over-approximation of the solution set of an input-affine system whose inputs represent bounded noise is presented. The method achieves a high-order error on each single time step and a uniform bound on the error over a finite time interval. The approach is based on approximating the inputs by linear functions on each time step. We derive the single-step error in the one-dimensional case and give a formula for the error in higher dimensions. As an illustration of the theory presented, a rigorous numerical result is given.