Automatic gender recognition based on pixel-pattern-based texture feature

  • Original Research Paper
  • Published in: Journal of Real-Time Image Processing

Abstract

A pixel-pattern-based texture feature (PPBTF) is proposed for real-time gender recognition. A gray-scale image is transformed into a pattern map in which edges and lines characterize the texture information. On the basis of the pattern map, a feature vector is formed from the numbers of pixels belonging to each pattern. The image basis functions obtained by principal component analysis (PCA) are used as the templates for pattern matching. The characteristics of the feature are analyzed comprehensively through an application to gender recognition: Adaboost selects the most discriminative feature subset, and support vector machines (SVMs) are adopted for classification. In experiments on frontal images from the FERET database, comparisons with Gabor features show that PPBTF is an effective facial representation and is faster to compute.
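
The pipeline described above (a pattern map built from PCA-derived templates, a histogram feature over pattern labels, and SVM classification) can be illustrated with a short Python sketch. This is not the authors' implementation: the 8 × 8 patch size, the 16 templates, the matching rule (highest absolute correlation with a template), and the function names such as extract_patches and ppbtf_feature are illustrative assumptions, and the Adaboost feature-selection stage is omitted.

```python
# Minimal PPBTF-style sketch (illustrative, not the authors' code).
# Assumptions: 8x8 patches, 16 PCA basis templates, each patch labelled by the
# template with the highest absolute correlation, a histogram of labels as the
# feature vector, and an SVM for classification (Adaboost selection omitted).
import numpy as np
from sklearn.decomposition import PCA
from sklearn.svm import SVC

PATCH = 8          # patch size (assumed)
N_PATTERNS = 16    # number of pattern templates (assumed)

def extract_patches(img, step=4):
    """Collect flattened PATCH x PATCH patches from a gray-scale image."""
    h, w = img.shape
    return np.array([
        img[r:r + PATCH, c:c + PATCH].ravel()
        for r in range(0, h - PATCH + 1, step)
        for c in range(0, w - PATCH + 1, step)
    ], dtype=float)

def learn_templates(train_images):
    """PCA basis functions of image patches serve as the pattern templates."""
    patches = np.vstack([extract_patches(im) for im in train_images])
    patches -= patches.mean(axis=1, keepdims=True)   # remove the DC component
    pca = PCA(n_components=N_PATTERNS)
    pca.fit(patches)
    return pca.components_                           # (N_PATTERNS, PATCH*PATCH)

def ppbtf_feature(img, templates):
    """Label each patch with its best-matching template, then histogram the labels."""
    patches = extract_patches(img)
    patches -= patches.mean(axis=1, keepdims=True)
    scores = np.abs(patches @ templates.T)           # correlation with each template
    labels = scores.argmax(axis=1)                   # pattern map (flattened)
    hist = np.bincount(labels, minlength=N_PATTERNS).astype(float)
    return hist / hist.sum()                         # normalized pattern histogram

def train_gender_classifier(images, genders, templates):
    """Fit an SVM on PPBTF histograms (genders: array of 0/1 labels)."""
    X = np.array([ppbtf_feature(im, templates) for im in images])
    clf = SVC(kernel="rbf")
    clf.fit(X, genders)
    return clf
```

A faithful reproduction would use the paper's own patch size, template count, and Adaboost-selected feature subset rather than these placeholder choices.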

Acknowledgments

Portions of the research in this article use the FERET database of facial images collected under the FERET program.

Author information

Corresponding author

Correspondence to Huchuan Lu.

About this article

Cite this article

Lu, H., Huang, Y., Chen, Y. et al. Automatic gender recognition based on pixel-pattern-based texture feature. J Real-Time Image Proc 3, 109–116 (2008). https://doi.org/10.1007/s11554-008-0072-2
