Today, social media websites have achieved great success by allowing users to upload personal media data and encouraging them to annotate it with free-form tags. With these rich tags, users can access image content on such websites more conveniently. However, the tags annotated by users are generally in a random order that does not reflect their relevance to the image content. To address this issue, we propose a tag ranking scheme that automatically ranks the tags of a given image by taking into account both their relevance to the image's visual content and the relationships among the tags. The proposed framework comprises four stages. First, given a tag query, a set of web images is collected from multiple search engines to cover the semantic space. Second, initial relevance scores of the tags with respect to the image's visual content are estimated in a Bayesian framework, in which a fused visual similarity is obtained as a linear combination of global and local visual similarities. Third, a tag graph is constructed by mining the relationships among tags. Finally, a random walk over the tag graph is performed to refine the tag ranking. Experimental results show that the proposed method is both effective and efficient.
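Two of the stages above lend themselves to a brief illustration: the fused visual similarity (a linear combination of global and local similarities) and the random-walk refinement over the tag graph. The sketch below is a minimal interpretation under stated assumptions; the function names, the combination weight `alpha`, the damping factor, and the restart-style update are illustrative choices, not the paper's exact formulation.

```python
import numpy as np

def fused_similarity(global_sim, local_sim, alpha=0.5):
    """Linear combination of global and local visual similarities.

    alpha is a hypothetical weighting parameter balancing the two cues.
    """
    return alpha * global_sim + (1.0 - alpha) * local_sim

def random_walk_rank(initial_scores, tag_graph, damping=0.8,
                     tol=1e-6, max_iter=100):
    """Refine tag relevance scores by a random walk over the tag graph.

    initial_scores: (n,) relevance scores from the earlier estimation stage.
    tag_graph:      (n, n) non-negative affinity matrix among tags.
    damping:        probability of following a graph edge rather than
                    restarting at the initial scores (an assumed parameter).
    """
    # Row-normalize the affinity matrix into a transition matrix.
    row_sums = tag_graph.sum(axis=1, keepdims=True)
    row_sums[row_sums == 0] = 1.0  # avoid division by zero for isolated tags
    P = tag_graph / row_sums

    # Normalize the initial scores into a distribution used for restarts.
    r = initial_scores / initial_scores.sum()
    base = r.copy()

    # Iterate the random-walk update until the scores stabilize.
    for _ in range(max_iter):
        r_next = damping * (P.T @ r) + (1.0 - damping) * base
        if np.abs(r_next - r).sum() < tol:
            r = r_next
            break
        r = r_next
    return r
```

In this reading, tags that are strongly connected to other highly scored tags gain relevance through the walk, while the restart term keeps the refined ranking anchored to the initial visual-content scores.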