TRANSFORMATION ALGORITHM FOR IMAGES OBTAINED BY OMNIDIRECTIONAL CAMERAS
For citation: Lazarenko V.P., Djamiykov T.S., Korotaev V.V., Yaryshev S.N. Transformation algorithm for images obtained by omnidirectional cameras. Scientific and Technical Journal of Information Technologies, Mechanics and Optics, 2015, vol. 15, no. 1, pp. 30–39 (in Russian)
Omnidirectional optoelectronic systems find application in areas where a wide viewing angle is critical. However, such systems exhibit large distortion, which complicates their use. The paper compares the projection functions of traditional perspective lenses and omnidirectional wide-angle fish-eye lenses with a viewing angle of not less than 180°. This comparison shows that the distortion of omnidirectional cameras cannot be described as a deviation from the classic pinhole camera model. To address this problem, an algorithm for transforming omnidirectional images has been developed. The paper gives a brief comparison of four calibration methods available in open-source toolkits for omnidirectional optoelectronic systems and presents the geometrical projection model used for calibration of the omnidirectional optical system. The algorithm consists of three basic steps. At the first step, the field of view of a virtual pinhole PTZ camera is calculated; this field of view is characterized by an array of 3D points in object space. At the second step, the array of pixels corresponding to these three-dimensional points is calculated by means of the projection function, which expresses the relation between a given 3D point in object space and the corresponding pixel; in this paper the projection function is obtained by a calibration procedure performed for the particular camera instance. At the last step, the final image is formed pixel by pixel from the original omnidirectional image using the calculated array of 3D points and the projection function. The developed algorithm makes it possible to obtain a distortion-corrected image for a part of the field of view of an omnidirectional optoelectronic system from the original omnidirectional image. The algorithm is designed to operate with omnidirectional optoelectronic systems based on both catadioptric and fish-eye lenses.
Experimental results are presented.
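The three steps described in the abstract can be sketched in Python as follows. This is a minimal illustrative sketch, not the authors' implementation: it assumes a simple equidistant fisheye model (r = f·θ) in place of the calibrated projection function obtained by the paper's calibration procedure, and uses nearest-neighbour sampling; all function names and parameters (`make_virtual_view`, `fisheye_f`, `cx`, `cy`, etc.) are hypothetical.

```python
import numpy as np

def make_virtual_view(pan_deg, tilt_deg, fov_deg, out_w, out_h):
    """Step 1: build the array of 3D unit rays (object-space points)
    covering the field of view of a virtual pinhole PTZ camera."""
    f = (out_w / 2) / np.tan(np.radians(fov_deg) / 2)
    xs, ys = np.meshgrid(np.arange(out_w) - out_w / 2,
                         np.arange(out_h) - out_h / 2)
    rays = np.stack([xs, ys, np.full_like(xs, f, dtype=float)], axis=-1)
    rays /= np.linalg.norm(rays, axis=-1, keepdims=True)
    # Orient the virtual camera by pan (around y) and tilt (around x).
    cp, sp = np.cos(np.radians(pan_deg)), np.sin(np.radians(pan_deg))
    ct, st = np.cos(np.radians(tilt_deg)), np.sin(np.radians(tilt_deg))
    R_tilt = np.array([[1, 0, 0], [0, ct, -st], [0, st, ct]])
    R_pan = np.array([[cp, 0, sp], [0, 1, 0], [-sp, 0, cp]])
    return rays @ (R_pan @ R_tilt).T

def project_equidistant(rays, fisheye_f, cx, cy):
    """Step 2: projection function mapping 3D rays to omni-image pixels.
    Here an assumed equidistant model r = f*theta stands in for the
    projection function that calibration would provide."""
    theta = np.arccos(np.clip(rays[..., 2], -1.0, 1.0))
    phi = np.arctan2(rays[..., 1], rays[..., 0])
    r = fisheye_f * theta
    return cx + r * np.cos(phi), cy + r * np.sin(phi)

def undistort(omni_img, pan, tilt, fov, out_w, out_h, fisheye_f, cx, cy):
    """Step 3: form the output image pixel by pixel by sampling the
    original omnidirectional image at the projected coordinates."""
    rays = make_virtual_view(pan, tilt, fov, out_w, out_h)
    u, v = project_equidistant(rays, fisheye_f, cx, cy)
    h, w = omni_img.shape[:2]
    ui = np.clip(np.round(u).astype(int), 0, w - 1)
    vi = np.clip(np.round(v).astype(int), 0, h - 1)
    return omni_img[vi, ui]
```

In a real system the nearest-neighbour lookup in the last step would typically be replaced by bilinear interpolation, and `project_equidistant` by the projection function returned by one of the calibration toolkits discussed in the paper.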
Acknowledgements. This work was carried out with state financial support for the leading universities of the Russian Federation (subsidy 074-U01).
This work is licensed under a Creative Commons Attribution-NonCommercial 4.0 International License