Pervasive computing applications have long used contact-free object detection to process images collected from smartphone, tablet, and wearable cameras. Two key challenges in uncontrolled environments are detection accuracy and speed, particularly on computationally limited embedded systems. Recent technological advances have led to depth sensors being integrated into a variety of embedded devices, such as smartphones and wearables. Depth sensors are contact free and avoid some of the pitfalls caused by variations in lighting and background that typically affect images from conventional cameras. This paper presents QuickFind, a fast and lightweight algorithm for object detection using only depth data. QuickFind is particularly suited to embedded platforms because it uses a quick segmentation method and relies on low-overhead features. We demonstrate QuickFind's performance empirically by benchmarking it on two embedded systems, the Raspberry Pi and the Intel Edison. We also demonstrate the wide applicability of QuickFind by developing two proof-of-concept pervasive computing applications.