A web crawler forms the backbone of a search engine, and this backbone merits a careful reassessment that could improve the efficiency of search engines. This paper conducts such a reassessment from a systems perspective, through the implementation and analysis of a web crawler, "VisionerBOT", as a feed-forward engine for search engines built on the MapReduce distributed programming model. Our crawler implementations revisit the classical operating-systems debate of threads versus events; a significant contribution of this work is the conclusion that an event-driven architecture is the better choice for web crawlers. Furthermore, in implementing the feed-forward mechanisms within the crawler, we identified important design considerations for the operating systems research community that could lead to a whole new class of operating systems.