This paper presents a no-reference method for the efficient detection of dropped video frames in live video streaming. The temporal information of successive video frames is computed and used to detect dropped frames. Frames are analyzed in binary form instead of true-color RGB in order to reduce computation time. The proposed method uses two separate thresholds for videos containing high and low motion content; the temporal information is examined against these thresholds, and the results are combined to obtain the total number of dropped frames. According to the existing literature, no-reference methods are unable to correctly perceive low-motion content and falsely classify such frames as dropped or frozen. Our method overcomes this problem by using separate thresholds for high- and low-motion content. Furthermore, because it operates on binary frames, it requires less computation time than other contemporary methods. Simulation results and comparisons show that our method offers high accuracy with low computation time.
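To make the described pipeline concrete, the sketch below illustrates the general idea in Python with NumPy: binarize each frame, measure temporal change between consecutive binary frames, select a threshold based on the clip's overall motion level, and count near-static transitions as dropped/frozen frames. The specific temporal-information measure, the threshold values, and the median-based motion classification are illustrative assumptions, not the paper's actual parameters.

```python
import numpy as np

def binarize(frame, level=128):
    """Threshold a grayscale frame to binary to cut per-pixel cost
    (level=128 is an assumed midpoint, not taken from the paper)."""
    return (frame >= level).astype(np.uint8)

def temporal_information(frames):
    """Fraction of pixels that change between consecutive binary frames."""
    ti = []
    prev = binarize(frames[0])
    for frame in frames[1:]:
        cur = binarize(frame)
        ti.append(np.mean(prev != cur))  # 0.0 for an exactly repeated frame
        prev = cur
    return np.array(ti)

def count_dropped(frames, low_thr=0.001, high_thr=0.01):
    """Count transitions whose temporal change falls below a
    motion-dependent threshold; a dropped (repeated/frozen) frame
    produces a near-zero change. Threshold values are hypothetical."""
    ti = temporal_information(frames)
    # Assumption: the median temporal change separates high- from
    # low-motion clips; pick the matching threshold accordingly.
    thr = high_thr if np.median(ti) > 0.05 else low_thr
    return int(np.sum(ti < thr))
```

For example, inserting a duplicate frame into a sequence of noise frames yields one zero-change transition, which `count_dropped` reports as a single dropped frame.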