A doctor palpating with his or her fingertips can easily determine whether a patient's tissue contains a solid tumor, regardless of extraneous factors such as the patient's pose. We want to create automatic palpation tools that achieve this same level of robustness, but real tactile sensor signals are sensitive to small physical variations and tend to drift over time. To investigate ways of achieving tactile perceptual invariance, we collected two nearly identical data sets on separate days from a SynTouch BioTac palpating 10,260 points distributed across an artificial tissue sample made from silicone rubber with rigid embedded lumps. Each palpation interaction was distilled down to 20 numbers (19 electrode impedances and one DC pressure value). A multivariate Gaussian distribution fit to examples recorded far from the lumps achieved near-perfect accuracy, precision, and recall at recognizing other background examples recorded on the same day. However, these models performed poorly when used to classify the other data set. Inspired by robust perceptual methods from computer vision, we transformed the 20 tactile sensor array readings into 190 binary pairwise comparisons (one bit for each of the 20 · 19 / 2 channel pairs) and used the log likelihood of observing a given binary pattern to determine whether an example was far from the embedded lumps. This novel approach achieved accuracy, precision, and recall of about 80% on reserved testing data from the same day and yielded roughly similar levels of accuracy and recall, with excellent precision, on the data set recorded on the other day. Pairwise comparisons between tactile sensor array readings may hold promise for achieving robust automatic tactile perception.
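
The abstract does not give implementation details for the Gaussian background model, but the approach it describes admits a simple realization: fit a 20-dimensional Gaussian to background palpations and flag new taps whose log-likelihood falls below a threshold. The sketch below shows one such detector; the array names, the synthetic stand-in data, and the 1st-percentile threshold are illustrative assumptions, not details from the paper.

```python
import numpy as np
from scipy.stats import multivariate_normal

# Stand-in data: in the paper, each row would be one palpation's
# 20-number summary (19 electrode impedances + 1 DC pressure).
rng = np.random.default_rng(0)
background = rng.normal(size=(5000, 20))   # taps recorded far from lumps
test_taps = rng.normal(size=(100, 20))     # held-out taps to classify

# Fit the background model: sample mean and covariance.
mu = background.mean(axis=0)
cov = np.cov(background, rowvar=False)
model = multivariate_normal(mean=mu, cov=cov)

# Score new taps by log-density. Taps whose log-likelihood falls below a
# threshold tuned on training data are flagged as "not background",
# i.e., likely recorded over an embedded lump.
threshold = np.percentile(model.logpdf(background), 1)  # assumed cutoff
is_background = model.logpdf(test_taps) >= threshold
```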
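The pairwise-comparison representation can be sketched the same way. The mapping from 20 readings to 190 bits follows directly from the text (one comparison per channel pair); the independent-Bernoulli model with Laplace smoothing used below to compute a pattern's log likelihood is an assumption for illustration, since the abstract does not state how that likelihood was modeled.

```python
import numpy as np
from itertools import combinations

PAIRS = list(combinations(range(20), 2))  # the 190 channel pairs

def to_binary_pattern(x):
    """Map a 20-element reading to 190 pairwise-comparison bits."""
    return np.array([1.0 if x[i] > x[j] else 0.0 for i, j in PAIRS])

# Stand-in background data, as in the previous sketch.
rng = np.random.default_rng(1)
background = rng.normal(size=(5000, 20))
bg_bits = np.array([to_binary_pattern(x) for x in background])

# Assumed model: treat the 190 bits as independent Bernoulli variables
# with probabilities estimated from the background class; Laplace
# smoothing keeps every log finite.
n = len(bg_bits)
p = (bg_bits.sum(axis=0) + 1.0) / (n + 2.0)

def log_likelihood(x):
    """Log probability of this tap's bit pattern under the background model."""
    b = to_binary_pattern(x)
    return float(np.sum(b * np.log(p) + (1.0 - b) * np.log(1.0 - p)))

# Classify a new tap as background if its log-likelihood clears a
# threshold tuned on training data (the 1st percentile is assumed).
scores = [log_likelihood(x) for x in background[:500]]
threshold = np.percentile(scores, 1)
print(log_likelihood(rng.normal(size=20)) >= threshold)
```

Because each bit encodes only the ordering of two channels, this representation discards absolute signal levels, which is one plausible reason it would tolerate the day-to-day drift that defeats the raw-signal Gaussian model.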