The performance of the Nearest Neighbor (NN) classifier depends heavily on the distance function used to find the nearest neighbor of an input test pattern. Many existing algorithms attempt to improve the accuracy of the NN rule by means of a weighted distance function. In the proposed method, the distance function is defined in a parametric form that incorporates the local relevance of the features into the decision boundary of each prototype. The local weight of each feature is determined by the amount of information it provides for discriminating between classes at that prototype. A novel learning algorithm tunes the weight vector of each prototype using an entropy-based objective function that is optimized by gradient descent. We also propose a new entropy measure in which the decision boundary of a prototype is treated as a fuzzy region. We show that the proposed scheme performs comparably to, or better than, several recent methods from the literature.
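To make the core idea concrete, the following is a minimal sketch of nearest-prototype classification with a per-prototype feature-weighted distance. The function names and toy weight values are illustrative assumptions, not the paper's implementation; in the proposed method the weight vectors would be learned, not fixed by hand.

```python
import numpy as np

def weighted_distance(x, prototype, w):
    # Per-prototype feature-weighted squared Euclidean distance;
    # w[i] encodes the local relevance of feature i for this prototype.
    x, prototype, w = np.asarray(x), np.asarray(prototype), np.asarray(w)
    return float(np.sum(w * (x - prototype) ** 2))

def classify(x, prototypes, labels, weights):
    # Assign x the label of the prototype with the smallest weighted distance.
    d = [weighted_distance(x, p, w) for p, w in zip(prototypes, weights)]
    return labels[int(np.argmin(d))]
```

In the paper's scheme, gradient descent on the entropy-based objective would tune each prototype's weight vector; the sketch above only shows how such local weights reshape the decision boundary at classification time.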