3D Model-Based Semantic Categorization of Still Image 2D Objects
Raluca-Diana Petre, Titus Zaharia
Copyright: © 2011 | Volume: 2 | Issue: 4 | Pages: 19
ISSN: 1947-8534 | EISSN: 1947-8542 | EISBN13: 9781613508541 | DOI: 10.4018/jmdem.2011100102
Cite Article

MLA

Petre, Raluca-Diana, and Titus Zaharia. "3D Model-Based Semantic Categorization of Still Image 2D Objects." IJMDEM, vol. 2, no. 4, 2011, pp. 19-37. http://doi.org/10.4018/jmdem.2011100102

APA

Petre, R., & Zaharia, T. (2011). 3D Model-Based Semantic Categorization of Still Image 2D Objects. International Journal of Multimedia Data Engineering and Management (IJMDEM), 2(4), 19-37. http://doi.org/10.4018/jmdem.2011100102

Chicago

Petre, Raluca-Diana, and Titus Zaharia. "3D Model-Based Semantic Categorization of Still Image 2D Objects," International Journal of Multimedia Data Engineering and Management (IJMDEM) 2, no. 4 (2011): 19-37. http://doi.org/10.4018/jmdem.2011100102


Abstract

Automatic classification and interpretation of objects present in 2D images is a key issue for various computer vision applications. In particular, when considering image/video indexing and retrieval applications, automatically labeling huge multimedia databases in a semantically pertinent manner still remains a challenge. This paper examines the issue of still image object categorization. The objective is to associate semantic labels to the 2D objects present in natural images. The principle of the proposed approach consists of exploiting categorized 3D model repositories to identify unknown 2D objects, based on 2D/3D matching techniques. The authors use 2D/3D shape indexing methods, where 3D models are described through a set of 2D views. Experimental results, carried out on both the MPEG-7 and Princeton 3D model databases, show recognition rates of up to 89.2%.
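The view-based matching principle described in the abstract can be illustrated with a minimal sketch: each categorized 3D model contributes a set of 2D-view shape descriptors, and an unknown 2D object is labeled with the category of its nearest view in descriptor space. The descriptor vectors, database layout, and distance measure below are illustrative assumptions, not the descriptors or matching scheme actually used by the authors.

```python
import math

def categorize(query_desc, view_db):
    """Assign a semantic label to a 2D object via nearest-neighbor 2D/3D matching.

    query_desc: shape descriptor of the unknown 2D object (hypothetical feature vector).
    view_db: list of (category, descriptor) pairs, one entry per projected 2D view
             of the categorized 3D models.
    """
    best_cat, best_dist = None, float("inf")
    for category, desc in view_db:
        d = math.dist(query_desc, desc)  # Euclidean distance in descriptor space
        if d < best_dist:
            best_cat, best_dist = category, d
    return best_cat

# Toy database: two 3D models ("chair", "car"), each contributing 2D views.
db = [("chair", [0.1, 0.9]), ("car", [0.8, 0.2]), ("chair", [0.2, 0.8])]
print(categorize([0.15, 0.85], db))  # nearest view belongs to a "chair" model
```

In practice the per-view descriptors would come from 2D shape indexing methods applied to silhouette projections of the 3D models, and the vote of several nearest views (rather than a single one) would typically decide the category.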
