Information retrieval systems typically use the vector space model to represent both queries and documents, and then rank the documents that best match a query by cosine distance. The same cosine distance is used to compute the distance between two documents represented in the vector space. This article proposes a new approach in which both documents and search queries are represented as time series of words. A document is therefore no longer seen as a ‘bag of words’ but as a sequence of words in which order is essential. On top of this model we apply dynamic time warping (DTW), a well-known time-series algorithm, to compute the distance between a search query and a document, or between two documents. To improve the model further, we extend the algorithm to take the semantic context into account by using WordNet when computing the distance between individual words. This semantic component integrates smoothly into the proposed time-series model and allows us to move beyond the ‘bag of words’ model into the area of semantic search.
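The core idea can be illustrated with a minimal sketch of DTW over word sequences. This is not the paper's implementation: `word_dist` here is a hypothetical placeholder that returns 0 for identical words and 1 otherwise, standing in for the WordNet-based semantic distance described above; a semantic measure could be swapped in via the same parameter.

```python
def dtw_distance(query, document, word_dist=None):
    """DTW distance between two word sequences (a sketch, not the paper's code).

    word_dist: pairwise word distance. The default exact-match distance is a
    placeholder for a WordNet-based semantic distance between words.
    """
    if word_dist is None:
        word_dist = lambda a, b: 0.0 if a == b else 1.0

    n, m = len(query), len(document)
    INF = float("inf")
    # dp[i][j] = DTW distance between query[:i] and document[:j]
    dp = [[INF] * (m + 1) for _ in range(n + 1)]
    dp[0][0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            cost = word_dist(query[i - 1], document[j - 1])
            dp[i][j] = cost + min(
                dp[i - 1][j],      # stretch: repeat the document word
                dp[i][j - 1],      # stretch: repeat the query word
                dp[i - 1][j - 1],  # align the two words
            )
    return dp[n][m]
```

Because DTW may align one word with several consecutive words in the other sequence, `dtw_distance("a a b".split(), "a b".split())` is 0.0: the repeated word is absorbed by the warping, while word order is still enforced.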