In this paper, we consider the problem of signal detection in photon-counting free-space optical systems. The underlying hypothesis testing problem is modeled as a goodness-of-fit test (GoFT), and the performance of several goodness-of-fit techniques on Poisson count variables is compared. This approach is particularly useful in scenarios where exact knowledge of the fading statistics induced by atmospheric turbulence is unavailable. Furthermore, for a GoFT, the distribution of the test statistic and the optimal detection threshold are easier to evaluate than for existing techniques, which also incur high computational complexity. Through extensive simulations, we show that among the GoFTs studied, namely the Fisher, Anscombe, Neyman-Scott, deviance, and Katz tests, the Neyman-Scott test outperforms all the others, while the Katz test provides a lower bound on the performance.
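To illustrate the GoFT approach described above, the following is a minimal sketch of one of the named tests applied to photon counts, assuming "Fisher test" refers to the classical Fisher index-of-dispersion test for the Poisson hypothesis. The function name, the significance level, and the gamma-mixed model used to mimic turbulence-induced fading are illustrative assumptions, not the paper's exact setup. Under the background-only hypothesis the counts are Poisson, so the dispersion statistic is approximately chi-squared distributed, which makes the detection threshold easy to compute:

```python
import numpy as np
from scipy.stats import chi2

def fisher_dispersion_test(counts, alpha=0.05):
    """Fisher index-of-dispersion GoF test for the Poisson hypothesis.

    Under H0 (counts i.i.d. Poisson), D = (n - 1) * s^2 / mean is
    approximately chi-squared with n - 1 degrees of freedom, so the
    detection threshold follows directly from the chi-squared quantiles.
    Returns (reject_H0, D).
    """
    counts = np.asarray(counts, dtype=float)
    n = counts.size
    mean = counts.mean()
    if mean == 0.0:
        return False, 0.0  # no photons observed; cannot reject H0
    d = (n - 1) * counts.var(ddof=1) / mean
    # Reject H0 when D leaves the central 1 - alpha chi-squared region.
    lo, hi = chi2.ppf([alpha / 2, 1 - alpha / 2], n - 1)
    return bool(d < lo or d > hi), d

rng = np.random.default_rng(0)
# H0: background-only Poisson counts (equidispersed).
background = rng.poisson(5.0, size=200)
# H1 (illustrative): gamma-mixed Poisson counts, overdispersed,
# mimicking intensity fluctuations from atmospheric turbulence.
faded = rng.poisson(rng.gamma(2.0, 3.0, size=200))
print(fisher_dispersion_test(background))  # typically does not reject
print(fisher_dispersion_test(faded))       # typically rejects
```

The appeal noted in the abstract is visible here: the null distribution of the statistic is a standard chi-squared, so no knowledge of the turbulence fading distribution is needed to set the threshold.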