This paper addresses the specific problem of human event detection from a video sequence in both indoor and outdoor environments. Foreground pixels are identified through background subtraction, with the reference background modelled as a mixture of time-varying Gaussian distributions. Color filtering in the RGB space is then used to remove image distortions due to camera effects and shadowing. A novel approach to the issue of sudden foreground bursts that appear as a result of impulsive environmental changes is also embedded into the foreground segmentation algorithm. Each object is tracked throughout its presence in the video using an assignment-problem-based tracker capable of handling multiple object interactions such as merges, splits, reappearances and disappearances. A feature space is constructed for each object and refined using a Kalman filter. A fusion of multiple features is used to obtain feature trajectories that closely represent the real feature variations of objects. An important aspect of the proposed method is its ability to operate and produce satisfactory results in scenes with dynamic background changes and complex inter-human interactions.
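The Kalman-filter refinement of per-object feature trajectories mentioned above can be illustrated with a minimal scalar filter. This is a sketch, not the paper's implementation: the random-walk motion model and the noise parameters `q` and `r` are illustrative assumptions chosen for the example.

```python
def kalman_smooth(measurements, q=1e-3, r=0.25):
    """Smooth a noisy scalar feature trajectory (e.g. an object's
    centroid x-coordinate over frames) with a 1-D Kalman filter.

    Assumes a random-walk state model; q is the (assumed) process
    noise variance and r the (assumed) measurement noise variance.
    """
    x = measurements[0]   # state estimate, initialised from first sample
    p = 1.0               # estimate variance
    smoothed = [x]
    for z in measurements[1:]:
        # Predict: under the random-walk model the estimate is unchanged,
        # but its variance grows by the process noise q.
        p += q
        # Update: blend the prediction with measurement z via the gain k.
        k = p / (p + r)
        x += k * (z - x)
        p *= (1.0 - k)
        smoothed.append(x)
    return smoothed
```

With oscillating measurements such as `[0, 1, 0, 1, 0, 1]`, the filtered trajectory settles near the mean rather than following every jump, which is the behaviour exploited when fusing several such smoothed feature trajectories per object.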