This paper presents a simple algorithm for adaptively stabilizing a linear one-dimensional discrete-time process. Prior knowledge of the sign of the process's high-frequency gain is not required, and the adaptive control law is a continuous function of its arguments. Performance of the algorithm is considered, and a simple modification is suggested that improves the convergence rate while preserving system stability.
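To illustrate the problem setting (not the paper's algorithm), the following sketch stabilizes a first-order plant x[k+1] = a*x[k] + b*u[k] when even the sign of b is unknown, using a standard certainty-equivalence controller with a normalized-gradient parameter estimator. All constants, the function name `simulate`, and the estimator details are assumptions chosen for illustration. Note the division by `b_hat`: this naive law is discontinuous at `b_hat = 0`, which is precisely the difficulty that motivates a control law continuous in its arguments.

```python
# Hypothetical sketch of adaptive stabilization with unknown gain sign.
# NOT the paper's method: a certainty-equivalence controller driven by a
# normalized-gradient estimator on the scalar plant x[k+1] = a*x[k] + b*u[k].

def simulate(a=0.8, b=-1.0, x0=1.0, steps=60):
    """Simulate the adaptive loop; (a, b) are unknown to the controller."""
    x = x0
    a_hat, b_hat = 0.0, 0.5          # initial guess has the WRONG gain sign
    for _ in range(steps):
        # Certainty-equivalence control; probe when b_hat is near zero
        # (this branch is the discontinuity a continuous law must avoid).
        u = -a_hat * x / b_hat if abs(b_hat) > 1e-3 else x
        x_next = a * x + b * u                  # true (unknown) plant
        e = x_next - (a_hat * x + b_hat * u)    # one-step prediction error
        denom = 1.0 + x * x + u * u             # normalization bounds updates
        a_hat += e * x / denom
        b_hat += e * u / denom
        x = x_next
    return x, a_hat, b_hat
```

Running `simulate()` typically shows a large state transient while the estimator corrects the sign of `b_hat`, after which the closed loop contracts rapidly; such bursts are one reason convergence-rate modifications of the kind the abstract mentions are of interest.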