Recent events highlight the need for mission-critical search and rescue technologies that provide timely identification and geolocation of objects lost within geographic areas that can span tens of thousands of square kilometers over both land and water. Although large quantities of Intelligence, Surveillance, and Reconnaissance (ISR) data are available from sensors on collection platforms, analysts cannot keep pace with the growing volume of sensor data, and no tool exists today that distills that data into the location and recognition information these missions require. This paper introduces a system, called MiData (Multifactor Information Distributed Analytics Technology Aide) Application to Local / Regional / Global Joined Object Recognition (MAJOR), to meet this need. MAJOR applies sensor and analytics technology in a new way to rapidly screen massive collections of still and video sensor imagery from multiple, diverse databases, chipping out and fusing Essential Elements of Information (EEIs) that transform raw data into actionable information from which analysts can locate lost objects in arbitrary geographic locations in a timely manner.
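The screen-chip-fuse workflow described above can be sketched in miniature as follows. This is purely an illustrative assumption, not MAJOR's actual implementation: the `Chip`, `EEI`, `screen_images`, and `toy_detector` names are hypothetical, and the detector stands in for whatever recognition analytics the system applies. The sketch cuts imagery into geolocated chips, scores each chip with a pluggable detector, and fuses the hits into EEI records ranked for analyst triage.

```python
# Illustrative sketch (hypothetical names) of a screen/chip/fuse pipeline:
# score geolocated image chips with a pluggable detector, keep chips whose
# score clears a threshold, and fuse survivors into ranked EEI records.
from dataclasses import dataclass
from typing import Callable, List

@dataclass
class Chip:
    image_id: str
    row: int          # top-left pixel row of the chip in the source image
    col: int          # top-left pixel column
    lat: float        # geolocation of the chip center (assumed metadata)
    lon: float

@dataclass
class EEI:
    image_id: str
    lat: float
    lon: float
    score: float      # detector confidence in [0, 1]

def screen_images(chips: List[Chip],
                  detector: Callable[[Chip], float],
                  threshold: float) -> List[EEI]:
    """Score every chip and fuse the hits into EEI records."""
    eeis = []
    for chip in chips:
        score = detector(chip)
        if score >= threshold:
            eeis.append(EEI(chip.image_id, chip.lat, chip.lon, score))
    # Highest-confidence detections first, so analysts triage the best leads.
    return sorted(eeis, key=lambda e: e.score, reverse=True)

# Toy detector: treat anything near a known search point as interesting.
def toy_detector(chip: Chip) -> float:
    return 1.0 - min(1.0, abs(chip.lat - 10.0) + abs(chip.lon - 20.0))

chips = [
    Chip("img-001", 0, 0, 10.01, 20.02),    # near the search point
    Chip("img-001", 0, 256, 14.50, 25.00),  # far away -> screened out
    Chip("img-002", 128, 0, 10.00, 20.00),  # directly on the search point
]
hits = screen_images(chips, toy_detector, threshold=0.5)
print([(h.image_id, round(h.score, 2)) for h in hits])
```

In a real system the detector would be a recognition model and the chips would be cut from massive still and video collections; the fusion step here is deliberately reduced to thresholding and ranking.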