Future humanoid robots will need the capability to grasp and manipulate arbitrary objects in order to assist people in their homes and to interact with them and with the environment. In this work, we present an approach to grasping known objects. Our approach consists of an offline grasp-planning step, a rating step that determines the human likeness of the grasps, and an execution step in which the most suitable grasp is performed on a humanoid robot. We focus in particular on the rating step, where we use human grasping data to rate pre-computed grasp hypotheses from our grasp planner in order to select the most human-like feasible grasp for execution on the real robot. We present the details of our method together with experiments on our ARMAR-III humanoid robot.
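The selection logic described above — rating pre-computed grasp hypotheses by human likeness and executing the best feasible one — can be sketched minimally as follows. This is an illustrative assumption, not the paper's implementation: the `Grasp` structure, the `human_likeness` score, and the `feasible` flag are hypothetical placeholders for the planner output, the rating from human grasping data, and the robot's reachability/collision check, respectively.

```python
from dataclasses import dataclass
from typing import Iterable, Optional

@dataclass
class Grasp:
    name: str
    human_likeness: float  # hypothetical rating derived from human grasping data
    feasible: bool         # hypothetical feasibility check on the real robot

def select_grasp(hypotheses: Iterable[Grasp]) -> Optional[Grasp]:
    """Return the most human-like grasp among the feasible hypotheses."""
    feasible = [g for g in hypotheses if g.feasible]
    if not feasible:
        return None  # no executable grasp for this object
    return max(feasible, key=lambda g: g.human_likeness)

# Toy example: a highly human-like grasp may still be rejected as infeasible.
hypotheses = [
    Grasp("top", 0.4, True),
    Grasp("side", 0.9, False),   # most human-like, but not reachable
    Grasp("pinch", 0.7, True),
]
best = select_grasp(hypotheses)  # picks "pinch"
```

The key point the sketch captures is that feasibility acts as a hard filter, while human likeness is only used to rank the remaining candidates.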