We present results of computational experiments with an extension of the Perceptron algorithm by a special type of simulated annealing. The simulated annealing procedure employs a logarithmic cooling schedule $$c(k) = \Gamma / \ln(k + 2)$$, where $$\Gamma$$ is a parameter that depends on the underlying configuration space. For sample sets $$S$$ of $$n$$-dimensional vectors generated by randomly chosen polynomials $$w_1 \cdot x_1^{a_1} + \cdots + w_n \cdot x_n^{a_n} \geqslant \vartheta$$, we try to approximate the positive and negative examples by linear threshold functions. The approximations are computed by both the classical Perceptron algorithm and our extension with logarithmic cooling schedules. For $$n = 256, \ldots, 1024$$ and $$a_i = 3, \ldots, 7$$, the extension outperforms the classical Perceptron algorithm by about 15% when the sample size is sufficiently large. The parameter $$\Gamma$$ was chosen according to estimations of the maximum escape depth from local minima of the associated energy landscape.
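The combination described above can be illustrated by a minimal sketch: a Perceptron-style weight update is always accepted when it reduces the number of misclassified samples, while an uphill move is accepted with probability $$\exp(-\Delta / c(k))$$ under the logarithmic cooling schedule $$c(k) = \Gamma / \ln(k + 2)$$. The function name, the error measure (misclassification count), and the acceptance rule below are illustrative assumptions, not the authors' exact procedure.

```python
import math
import random

def anneal_perceptron(samples, labels, gamma, steps, seed=0):
    """Illustrative sketch: Perceptron updates driven by simulated
    annealing with the logarithmic cooling c(k) = gamma / ln(k + 2).

    The energy of a weight vector is taken here to be the number of
    misclassified samples (an assumption for this sketch)."""
    rng = random.Random(seed)
    n = len(samples[0])
    w = [0.0] * n

    def errors(weights):
        # Count samples whose sign under the linear threshold
        # function disagrees with the given label (+1 / -1).
        count = 0
        for x, y in zip(samples, labels):
            s = sum(wi * xi for wi, xi in zip(weights, x))
            if (1 if s >= 0 else -1) != y:
                count += 1
        return count

    e = errors(w)
    for k in range(steps):
        # Form a Perceptron-style candidate update from a random sample.
        x, y = rng.choice(list(zip(samples, labels)))
        cand = [wi + y * xi for wi, xi in zip(w, x)]
        e_cand = errors(cand)
        delta = e_cand - e
        c = gamma / math.log(k + 2)  # logarithmic cooling schedule
        # Accept downhill moves always; uphill moves with annealing probability.
        if delta <= 0 or rng.random() < math.exp(-delta / c):
            w, e = cand, e_cand
    return w, e
```

On a small linearly separable set, e.g. `anneal_perceptron([(1.0, 2.0), (2.0, 1.0), (-1.0, -2.0), (-2.0, -1.0)], [1, 1, -1, -1], gamma=1.0, steps=200)`, the final error count reaches zero; on the polynomially generated samples of the abstract, only an approximation by a linear threshold function is expected.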