Abstract
We propose a texture synthesis method that controls the desired visual impressions using CNN style and content features. Diversifying user needs have led to products personalized to individual preferences. In custom-made garment services, users design garments that match their tastes by selecting and combining fabrics, patterns, and garment shapes prepared in advance. Controlling visual impressions would further enable such services to provide designs that better match user preferences. In image synthesis, controllable texture synthesis has been performed with style and content; however, few previous studies have controlled images based on impressions (including aesthetics). In this study, we aim to synthesize textures with desired visual impressions using style and content features. To this end, we first (1) quantify affective texture perception through subjective evaluation experiments, and (2) extract style and content features with VGG-19 from pattern images to which evaluation scores are assigned. Using the style and content features as explanatory variables and the evaluation scores as objective variables, we construct an impression estimation model with Lasso regression for each impression. Finally, (3) based on these impression estimation models, we control the visual impressions and synthesize textures. In step (2), we obtained highly accurate visual impression estimation models using style and content features; in step (3), the synthesis results matched human intuition.
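The modeling step in (2) can be sketched as follows. This is a minimal illustration, not the authors' implementation: random arrays stand in for the VGG-19 convolutional feature maps and the subjective evaluation scores, and the `alpha` value is an arbitrary placeholder. The style representation is the channel-wise Gram matrix in the sense of Gatys et al., flattened and used as the explanatory variables of a Lasso regression.

```python
import numpy as np
from sklearn.linear_model import Lasso

def gram_matrix(feature_map):
    """Channel-wise Gram matrix of a CNN feature map (C, H, W):
    the 'style' representation in the sense of Gatys et al."""
    c, h, w = feature_map.shape
    f = feature_map.reshape(c, h * w)
    return f @ f.T / (h * w)

rng = np.random.default_rng(0)

# Stand-in for VGG-19 conv feature maps of 100 pattern images.
# In the paper these come from a pretrained VGG-19; here they are random.
n_images, channels, h, w = 100, 16, 8, 8
feature_maps = rng.standard_normal((n_images, channels, h, w))

# Explanatory variables: the upper triangle of each (symmetric) Gram matrix.
iu = np.triu_indices(channels)
X = np.stack([gram_matrix(fm)[iu] for fm in feature_maps])

# Objective variable: subjective evaluation scores (hypothetical values here).
y = rng.uniform(1, 7, size=n_images)

# One sparse impression-estimation model per impression word.
model = Lasso(alpha=0.1).fit(X, y)
print("non-zero coefficients:", np.count_nonzero(model.coef_))
```

Lasso's L1 penalty drives most coefficients to zero, so each impression model keeps only the style-feature dimensions that actually explain the evaluation scores; in the paper's step (3), such a model can then serve as an objective to steer synthesis toward a desired impression.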
Copyright information
© 2023 The Author(s), under exclusive license to Springer Nature Singapore Pte Ltd.
Cite this paper
Sugiyama, Y., Sunda, N., Tobitani, K., Nagata, N. (2023). Texture Synthesis Based on Aesthetic Texture Perception Using CNN Style and Content Features. In: Na, I., Irie, G. (eds) Frontiers of Computer Vision. IW-FCV 2023. Communications in Computer and Information Science, vol 1857. Springer, Singapore. https://doi.org/10.1007/978-981-99-4914-4_9
Publisher Name: Springer, Singapore
Print ISBN: 978-981-99-4913-7
Online ISBN: 978-981-99-4914-4