APPLICATION OF THE LONGITUDINAL CHROMATIC ABERRATION EFFECT FOR DISTANCES MEASUREMENT ON THE BASIS OF A SINGLE PHOTO
For citation: Volkova M.A., Lutsiv V.R. Application of the longitudinal chromatic aberration effect for distances measurement on the basis of a single photo. Scientific and Technical Journal of Information Technologies, Mechanics and Optics, 2016, vol. 16, no. 2, pp. 251–257. doi: 10.17586/2226-1494-2016-16-2-251-257
Subject of Research. We propose a method for measuring the distances to the surfaces of photographed objects based on the effect of longitudinal chromatic aberration. According to this effect, the focal length of a lens depends on the wavelength of the refracted light; thus, the defocus of the image formed by the lens depends not only on the distance from the image plane to the lens, but also on the wavelength of light the picture was taken in (the red, green or blue color range). Method. The proposed method of distance measurement is based on a comparison of image defocus at different wavelengths (e.g., red and blue). The comparison is performed in the spatial frequency domain and is based on the analysis of image complex spectrograms (the Fourier spectrum calculated locally within a window scanning the image). The distance to each point of the photographed surface is calculated in closed form from the local spatial spectrum using a Gaussian model of the point spread function. Main Results. The working capacity of the distance measurement technique is partially verified by imitating image defocus with different displacements of the objective lens with respect to the sensor matrix of the camera. The presented analysis of the chromatic parameters of commonly used optical materials also proves the physical realizability of the proposed technique. A technology is also proposed for distance measurement based on the actually differing image defocus in different color channels, applying a special calibration of the electro-optical system. Practical Relevance. The proposed distance measurement technique could be practically useful in cases where active illumination of the photographed objects is prohibited. It is also worth applying in inexpensive compact optical measuring devices such as the Kinect for Xbox 360.
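The core of the method can be illustrated with a minimal sketch. Under the Gaussian PSF model, the magnitude spectra of the same surface patch blurred with standard deviations σ_a and σ_b satisfy ln|F_a(f)| − ln|F_b(f)| = −2π²f²(σ_a² − σ_b²), so the defocus difference between two color channels follows in closed form from the log-ratio of their local spectra; lens calibration then maps this difference to distance. The sketch below is a 1-D stand-in for the windowed 2-D spectrogram analysis described in the abstract, with illustrative function names; it is not the authors' exact formulation.

```python
import numpy as np

def gaussian_blur_1d(signal, sigma):
    """Blur via multiplication by a Gaussian OTF in the frequency domain."""
    f = np.fft.fftfreq(signal.size)                  # cycles per sample
    otf = np.exp(-2.0 * (np.pi * f * sigma) ** 2)    # FT of Gaussian PSF
    return np.fft.ifft(np.fft.fft(signal) * otf).real

def sigma_sq_difference(patch_a, patch_b, fmin=0.02, fmax=0.15):
    """Closed-form estimate of sigma_a^2 - sigma_b^2 from the log-ratio
    of two local magnitude spectra (Gaussian PSF model):
        ln|F_a(f)| - ln|F_b(f)| = -2*pi^2*f^2*(sigma_a^2 - sigma_b^2)
    The scene texture cancels in the ratio, leaving only the OTF ratio."""
    f = np.fft.rfftfreq(patch_a.size)
    Fa = np.abs(np.fft.rfft(patch_a))
    Fb = np.abs(np.fft.rfft(patch_b))
    band = (f > fmin) & (f < fmax)                   # skip DC and noisy highs
    log_ratio = np.log(Fa[band]) - np.log(Fb[band])
    x = -2.0 * np.pi ** 2 * f[band] ** 2
    # least-squares fit of log_ratio = x * d for the defocus difference d
    return float(np.dot(x, log_ratio) / np.dot(x, x))

rng = np.random.default_rng(0)
texture = rng.standard_normal(1024)                  # richly textured "surface"
red = gaussian_blur_1d(texture, sigma=3.0)           # red channel: stronger defocus
blue = gaussian_blur_1d(texture, sigma=1.5)          # blue channel: weaker defocus
d = sigma_sq_difference(red, blue)
print(d)  # ≈ 3.0**2 - 1.5**2 = 6.75
```

In a real electro-optical system the recovered difference σ_red² − σ_blue² would be converted to distance through the per-lens calibration mentioned in the abstract; that mapping is device-specific and is not modeled here.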
Acknowledgements. The work was supported by the Ministry of Education and Science, and partially by the Government of the Russian Federation (grant 074-U01).
This work is licensed under a Creative Commons Attribution-NonCommercial 4.0 International License