UDC 004.932. Vestnik SibGAU. 2014, No. 3(55), P. 120–125
AUTOMATIC LANDSCAPE IMAGE ANNOTATION
A. V. Proskurin
Siberian State Aerospace University named after academician M. F. Reshetnev, 31 Krasnoyarsky Rabochy Av., Krasnoyarsk, 660014, Russian Federation. E-mail: Proskurin.AV.WOF@gmail.com
Image retrieval on the Internet and in specialized datasets is an important task. For such retrieval it is expedient to apply systems of automatic image annotation (AIA) based on low-level features. Due to the wide variety of images, it is sometimes useful to categorize images and to tailor AIA methods to these categories. This article discusses automatic landscape image annotation (ALIA). Natural objects in landscape images (rocks, clouds, etc.) often contain just one texture, so the machine translation model is sufficient for ALIA. In this model, the process of image annotation is treated as a translation from one form of representation (image regions) to another (keywords). First, a segmentation algorithm partitions images into object-shaped regions. Then clustering is applied to the feature descriptors extracted from all regions in order to build visual words (clusters of visually similar image regions). Finally, a machine translation model is applied to build a translation table containing probability estimates of the translation between visual words and keywords. An unseen image is annotated by choosing the most likely word for each of its regions. The ALIA algorithm was developed using the machine translation model, in which the JSEG color-texture segmentation algorithm is applied at the segmentation step. Additionally, before segmentation the image is downscaled in order to prevent the appearance of small regions and to reduce computational cost, and after segmentation the resulting segmentation map is upscaled to the size of the original image. A region descriptor including second-order statistical features and fractal features is proposed to describe the obtained segments. The extracted feature vectors are clustered using the k-means algorithm.
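The second-order statistical features mentioned above are the classic Haralick statistics computed from a gray-level co-occurrence matrix. The sketch below illustrates three of them for a single region; the quantization level count and the displacement are illustrative assumptions, not the parameters used in the paper.

```python
import numpy as np

def glcm(img, levels=8, dx=1, dy=0):
    """Normalized gray-level co-occurrence matrix for one displacement (dx, dy)."""
    # Quantize intensities to a small number of gray levels.
    q = (img.astype(float) / img.max() * (levels - 1)).astype(int)
    m = np.zeros((levels, levels))
    h, w = q.shape
    for y in range(h - dy):
        for x in range(w - dx):
            m[q[y, x], q[y + dy, x + dx]] += 1  # count co-occurring pairs
    return m / m.sum()  # joint probabilities

def haralick_subset(p):
    """Three second-order statistics: contrast, energy, homogeneity."""
    i, j = np.indices(p.shape)
    contrast = ((i - j) ** 2 * p).sum()
    energy = (p ** 2).sum()
    homogeneity = (p / (1.0 + (i - j) ** 2)).sum()
    return contrast, energy, homogeneity
```

In practice such statistics are computed over several displacements and combined with the fractal features into one region descriptor.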
The proposed algorithm annotates landscape images with 88 % precision and can be applied to annotating images from specialized image datasets and the Internet.
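As a rough illustration of the translation-table step, the sketch below estimates P(keyword | visual word) by simple co-occurrence counting (a simplified stand-in for the EM estimation used in the machine translation model) and annotates an unseen image by the most likely word per region; all names and the counting scheme are illustrative assumptions.

```python
from collections import defaultdict

def build_translation_table(training):
    """training: list of (visual_word_ids, keywords) pairs, one per image.
    Returns table[v][w] ~ P(keyword w | visual word v)."""
    counts = defaultdict(lambda: defaultdict(float))
    for vwords, words in training:
        for v in vwords:
            for w in words:
                counts[v][w] += 1.0 / len(words)  # spread mass over the image's keywords
    table = {}
    for v, row in counts.items():
        total = sum(row.values())
        table[v] = {w: c / total for w, c in row.items()}  # normalize per visual word
    return table

def annotate(table, vwords):
    """Choose the most likely keyword for each region's visual word."""
    return [max(table[v], key=table[v].get) for v in vwords if v in table]
```

For example, if visual word 0 co-occurs mostly with "sky" and visual word 1 with "rock" in training, an unseen image whose regions map to [0, 1] is annotated ["sky", "rock"].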
Keywords: landscape images, automatic annotation, JSEG algorithm, texture features.

Proskurin Alexander Victorovich – postgraduate student, Siberian State Aerospace University named after academician M. F. Reshetnev. E-mail: Proskurin.AV.WOF@gmail.com.