We start from a variable x with an unspecified (and possibly even infinite-variance) distribution, and we truncate x from above and below with bounds that may depend linearly on a second variable, y. We investigate how the variance of this truncated variable is affected by a binomial version of the Rothschild-Stiglitz measure of increased riskiness of x or y. We find that, for most unimodal distributions of x, such an increase in the riskiness of x increases the variance of the truncated variable, whereas the effect of increased riskiness in y is ambiguous.
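The main finding can be illustrated numerically. The sketch below is an illustrative construction, not the paper's own analysis: it assumes a standard normal x (a unimodal case), interprets truncation as censoring x at fixed bounds, and models the increase in riskiness as a binary mean-preserving spread (adding ±s with equal probability, a simple Rothschild-Stiglitz increase in risk). The bound values and spread size are arbitrary choices for the example.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 1_000_000

# Illustrative assumptions: x ~ N(0, 1) (unimodal), truncation bounds [-1, 1].
x = rng.normal(0.0, 1.0, n)
lo, hi = -1.0, 1.0

# Binary mean-preserving spread: add +s or -s with equal probability.
# This is a simple Rothschild-Stiglitz increase in the riskiness of x.
s = 0.5
x_riskier = x + rng.choice([-s, s], size=n)

def truncated_var(z, lo, hi):
    """Censor z at the bounds and return the variance of the result."""
    return np.clip(z, lo, hi).var()

v0 = truncated_var(x, lo, hi)
v1 = truncated_var(x_riskier, lo, hi)
print(f"variance before spread: {v0:.3f}")
print(f"variance after spread:  {v1:.3f}")
```

For this normal case the riskier version yields a strictly larger variance of the truncated variable, consistent with the result stated for most unimodal distributions of x.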