doi: 10.17586/2226-1494-2016-16-4-670-677


APPLICATION OF BINARY DESCRIPTORS TO MULTIPLE FACE TRACKING IN VIDEO SURVEILLANCE SYSTEMS

A. L. Oleinik


Article in Russian

For citation: Oleinik A.L. Application of binary descriptors to multiple face tracking in video surveillance systems. Scientific and Technical Journal of Information Technologies, Mechanics and Optics, 2016, vol. 16, no. 4, pp. 670–677. doi: 10.17586/2226-1494-2016-16-4-670-677

Abstract

Subject of Research. The paper deals with the problem of multiple face tracking in a video stream. The primary application of the implemented tracking system is automatic video surveillance. The particular operating conditions of surveillance cameras are taken into account in order to increase the efficiency of the system in comparison with existing general-purpose analogs. Method. The developed system consists of two subsystems: a detector and a tracker. The tracking subsystem does not depend on the detector, so various face detection methods can be used. Furthermore, in this structure only a small portion of frames is processed by the detector, which substantially improves the processing rate. The tracking algorithm is based on BRIEF binary descriptors, which are computed very efficiently on modern processor architectures. Main Results. The system is implemented in C++, and experiments evaluating processing rate and tracking quality are carried out. The MOTA and MOTP metrics are used to measure tracking quality. The experiments demonstrate a four-fold gain in processing rate in comparison with the baseline implementation, which runs the detector on every video frame. The tracking quality remains at an adequate level relative to the baseline. Practical Relevance. The developed system can be combined with various face detectors (including slow ones) to create a fully functional high-speed multiple face tracking solution. The algorithm is easy to implement and optimize, so it may be applied not only in full-scale video surveillance systems, but also in embedded solutions integrated directly into cameras.
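The two-stage structure described above (a detector invoked only on a small fraction of frames, with BRIEF-descriptor matching carrying each face between detections) can be illustrated by the following minimal single-face sketch. It is not the author's implementation: it assumes OpenCV with the xfeatures2d contrib module (cf. [18], [21], [23], [25]), and the detection period, FAST threshold, Hamming match threshold and file names are illustrative assumptions.

// Minimal sketch of the detector/tracker split (single face, assumptions as noted above).
#include <opencv2/core.hpp>
#include <opencv2/imgproc.hpp>
#include <opencv2/features2d.hpp>
#include <opencv2/xfeatures2d.hpp>
#include <opencv2/objdetect.hpp>
#include <opencv2/videoio.hpp>
#include <vector>

int main() {
    cv::CascadeClassifier detector("haarcascade_frontalface_default.xml"); // Viola-Jones cascade [21]
    cv::VideoCapture cap("surveillance.avi");                              // hypothetical input video
    auto fast  = cv::FastFeatureDetector::create(20);                      // FAST corners [23]
    auto brief = cv::xfeatures2d::BriefDescriptorExtractor::create(32);    // 256-bit BRIEF [18]
    cv::BFMatcher matcher(cv::NORM_HAMMING);        // Hamming distance suits binary descriptors

    const int DETECT_EVERY = 10;   // run the (slow) detector on every 10th frame only
    cv::Rect face;                 // currently tracked face box (single face for brevity)
    std::vector<cv::KeyPoint> prevKp;
    cv::Mat prevDesc, frame, gray;

    for (int i = 0; cap.read(frame); ++i) {
        cv::cvtColor(frame, gray, cv::COLOR_BGR2GRAY);

        if (i % DETECT_EVERY == 0) {               // detection step: refresh the face position
            std::vector<cv::Rect> faces;
            detector.detectMultiScale(gray, faces);
            if (!faces.empty()) face = faces[0];
        }
        face &= cv::Rect(0, 0, gray.cols, gray.rows);   // keep the box inside the frame
        if (face.area() == 0) continue;                 // nothing to track yet

        // Tracking step: describe keypoints inside the current face region
        // and match them to the previous frame to estimate the displacement.
        std::vector<cv::KeyPoint> kp;
        cv::Mat desc;
        fast->detect(gray(face), kp);
        brief->compute(gray(face), kp, desc);
        for (auto& k : kp) { k.pt.x += face.x; k.pt.y += face.y; }  // to absolute coordinates

        if (!prevDesc.empty() && !desc.empty()) {
            std::vector<cv::DMatch> matches;
            matcher.match(desc, prevDesc, matches);
            cv::Point2f shift(0.f, 0.f);
            int used = 0;
            for (const auto& m : matches)
                if (m.distance < 40) {             // assumed match acceptance threshold
                    shift += kp[m.queryIdx].pt - prevKp[m.trainIdx].pt;
                    ++used;
                }
            if (used > 0) {                        // move the face box by the mean displacement
                face.x += cvRound(shift.x / used);
                face.y += cvRound(shift.y / used);
            }
        }
        prevKp = std::move(kp);
        prevDesc = desc;
    }
    return 0;
}

A full multiple-face tracker would keep one descriptor model per face and associate new detections with existing tracks, but the detector/tracker separation and the Hamming-distance matching of binary descriptors remain the same.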


Keywords: multiple face tracking, binary descriptors, video surveillance, computer vision

Acknowledgements. This work was partially financially supported by the Government of the Russian Federation, Grant 074-U01. The author expresses his sincere appreciation to Professor Georgy Kukharev, his scientific adviser; Yuri Matveev, Head of SIS Department, and Alexander Melnikov for their critical remarks and advice that significantly improved this paper.

References

1. Melnikov A., Akhunzyanov R., Kudashev O., Luckyanets E. Audiovisual liveness detection. Lecture Notes in Computer Science, 2015, vol. 9280, pp. 643–652. doi: 10.1007/978-3-319-23234-8_59
2. Yilmaz A., Javed O., Shah M. Object tracking: a survey. ACM Computing Surveys, 2006, vol. 38, no. 4, pp. 1–45. doi: 10.1145/1177352.1177355
3. Yang H., Shao L., Zheng F., Wang L., Song Z. Recent advances and trends in visual tracking: a review. Neurocomputing, 2011, vol. 74, no. 18, pp. 3823–3831. doi: 10.1016/j.neucom.2011.07.024
4. Smeulders A.W.M., Chu D.M., Cucchiara R., Calderara S., Dehghan A., Shah M. Visual tracking: an experimental survey. IEEE Transactions on Pattern Analysis and Machine Intelligence, 2014, vol. 36, no. 7, pp. 1442–1468. doi: 10.1109/TPAMI.2013.230
5. Kristan M., Matas J., et al. The visual object tracking VOT2015 challenge results. Proc. IEEE Int. Conf. on Computer Vision. Santiago, Chile, 2015, pp. 564–586.
6. Matthews I., Ishikawa T., Baker S. The template update problem. IEEE Transactions on Pattern Analysis and Machine Intelligence, 2004, vol. 26, no. 6, pp. 810–815. doi: 10.1109/TPAMI.2004.16
7. Minnehan B., Spang H., Savakis A.E. Robust and efficient tracker using dictionary of binary descriptors and locality constraints. Lecture Notes in Computer Science, 2014, vol. 8887, pp. 589–598.
8. Avidan S. Ensemble tracking. IEEE Transactions on Pattern Analysis and Machine Intelligence, 2007, vol. 29, no. 2, pp. 261–271. doi: 10.1109/TPAMI.2007.35
9. Kalal Z., Mikolajczyk K., Matas J. Tracking-learning-detection. IEEE Transactions on Pattern Analysis and Machine Intelligence, 2012, vol. 34, no. 7, pp. 1409–1422. doi: 10.1109/TPAMI.2011.239
10. Broida T.J., Chellappa R. Estimation of object motion parameters from noisy images. IEEE Transactions on Pattern Analysis and Machine Intelligence, 1986, vol. PAMI-8, no. 1, pp. 90–99.
11. Isard M., Blake A. CONDENSATION – conditional density propagation for visual tracking. International Journal of Computer Vision, 1998, vol. 29, no. 1, pp. 5–28.
12. Maggio E., Piccardo E., Regazzoni C., Cavallaro A. Particle PHD filtering for multi-target visual tracking. IEEE Int. Conf. on Acoustics, Speech and Signal Processing, ICASSP, 2007, vol. 1, art. 4217276. doi: 10.1109/ICASSP.2007.366104
13. Lucas B.D., Kanade T. An iterative image registration technique with an application to stereo vision. Proc. 7th Int. Joint Conference on Artificial Intelligence, IJCAI. Vancouver, Canada, 1981, pp. 674–679.
14. Baker S., Matthews I. Lucas-Kanade 20 years on: a unifying framework. International Journal of Computer Vision, 2004, vol. 56, no. 3, pp. 221–255. doi: 10.1023/B:VISI.0000011205.11775.fd
15. Ta D.-N., Chen W.-C., Gelfand N., Pulli K. SURFTrac: Efficient tracking and continuous object recognition using local feature descriptors. IEEE Conference on Computer Vision and Pattern Recognition, CVPR. Miami, USA, 2009, pp. 2937–2944. doi: 10.1109/CVPRW.2009.5206831
16. Bilinski P., Bremond F., Kaaniche M.-B. Multiple object tracking with occlusions using HOG descriptors and multi resolution images. Proc. 3rd Int. Conf. on Crime Detection and Prevention, ICDP. London, 2009, no. 2. doi: 10.1049/ic.2009.0264
17. Panti B., Monteiro P., Pereira F., Ascenso J. Descriptor-based adaptive tracking-by-detection for visual sensor networks. IEEE Int. Conf. on Multimedia and Expo Workshops, ICMEW. Turin, Italy, 2015. doi: 10.1109/ICMEW.2015.7169807
18. Calonder M., Lepetit V., Strecha C., Fua P. BRIEF: Binary robust independent elementary features. Proc. 11th European Conference on Computer Vision, ECCV. Heraklion, Crete, Greece, 2010, pp. 778–792. doi: 10.1007/978-3-642-15561-1_56
19. Ishii I., Ichida T., Gu Q., Takaki T. 500-fps face tracking system. Journal of Real-Time Image Processing, 2013, vol. 8, no. 4, pp. 379–388. doi: 10.1007/s11554-012-0255-8
20. Kukharev G.A., Kamenskaya E.I., Matveev Y.N., Shchegoleva N.L. Metody Obrabotki i Raspoznavaniya Izobrazhenii Lits v Zadachakh Biometrii [Methods for Face Image Processing and Recognition in Biometric Applications]. Ed. M.V. Khitrov. St. Petersburg, Politekhnika Publ., 2013, 388 p.
21. Viola P., Jones M. Rapid object detection using a boosted cascade of simple features. Proc. IEEE Computer Society Conference on Computer Vision and Pattern Recognition, CVPR. Kauai, USA, 2001, vol. 1, pp. 511–518.
22. Ahuja R.K., Magnanti T.L., Orlin J.B. Network Flows: Theory, Algorithms, and Applications. Prentice Hall, 1993, 864 p.
23. Rosten E., Drummond T. Machine learning for high-speed corner detection. Lecture Notes in Computer Science, 2006, vol. 3951, pp. 430–443. doi: 10.1007/11744023_34
24. Szeliski R. Computer Vision: Algorithms and Applications. London, Springer-Verlag, 2011, 812 p. doi: 10.1007/978-1-84882-935-0
25. OpenCV (open source computer vision). Available at: http://opencv.org (accessed 09.06.2015).
26. LEMON Graph Library. Available at: https://lemon.cs.elte.hu/trac/lemon (accessed 15.02.2016).
27. Li L., Nawaz T., Ferryman J. PETS 2015: Datasets and challenge. Proc. 12th IEEE Int. Conf. on Advanced Video and Signal Based Surveillance, AVSS. Karlsruhe, Germany, 2015. doi: 10.1109/AVSS.2015.7301741
28. Collins R., Zhou X., Teh S.K. An open source tracking testbed and evaluation web site. IEEE Int. Workshop on Performance Evaluation of Tracking and Surveillance, PETS, 2005, vol. 2, p. 35.
29. Doermann D., Mihalcik D. Tools and techniques for video performance evaluation. Proc. 15th Int. Conf. on Pattern Recognition, 2000, vol. 15, no. 4, pp. 167–170.
30. Bernardin K., Stiefelhagen R. Evaluating multiple object tracking performance: the CLEAR MOT metrics. EURASIP Journal on Image Video Processing, 2008, vol. 2008, art. 246309. doi: 10.1155/2008/246309
 



This work is licensed under a Creative Commons Attribution-NonCommercial 4.0 International License.