
Perceptual holistic color combination analysis of Papilionidae butterflies as aesthetic objects

Abstract

In this study, we clarified the holistic color combination rules of human-preferred Papilionidae butterflies by examining the hue, lightness, and chroma. A set of 118 Papilionidae butterfly images used in our previous study was analyzed. These images were classified via hierarchical density-based spatial clustering based on perceptual similarities of colors that were obtained from a subjective image classification experiment. The color combinations of the clustered images were determined based on representative colors that were analyzed by a Gaussian mixture model with minimum message length and the color combination types defined in our previous study. Consequently, we obtained the following holistic color combination rules for Papilionidae: 1) contrasting lightness, similar chroma, and similar hue, 2) contrasting lightness, contrasting chroma, and similar hue, 3) similar lightness, similar chroma, and complementary hue, and 4) similar lightness, similar chroma, and similar hue. These rules suggest that minority color harmony theories are valid under particular conditions.

Introduction

Color harmony studies contribute not only to developing color design but also to clarifying the mechanisms of human aesthetic responses. Conventional color harmony theories developed in the past have not always been consistent with psychological results [1–4]. Thus, to reconcile conventional color harmony theories with psychological results, the concept of computational aesthetics has been applied to color harmony studies, e.g., to clarify the color combination structures of paintings and butterflies as aesthetic objects [5, 6]. A previous study of paintings did not identify common color combination rules [5]. In contrast, our previous work [6] focused on the beautiful colors in nature that have been applied to color design [7–11] and on the artistic quality of Papilionidae butterflies [12–17]. There, the color combination rules of 118 human-preferred Papilionidae butterfly images were clarified [6]: contrasting lightness, similar chroma, and similar hue, which agreed with a part of the psychological color harmony principles [1–4]. Nevertheless, similar lightness, contrasting chroma, and complementary hue were also obtained as a minority of the results, agreeing with a part of the conventional color harmony principles [1, 18]. These results suggest that minority color combinations that do not appear in the psychological results may harmonize only under limited conditions. In our previous work, the color combinations were analyzed for each color appearance attribute (i.e., lightness, chroma, and hue) independently, which considerably simplified the results. To obtain more detailed color combination rules, the color combinations should be analyzed across the integrated color appearance attributes. Moreover, our previous work analyzed the color combinations of images classified by color similarity [6] using a computational image classification method, which does not accurately reflect human visual perception. To obtain more perceptual results, the color similarities should be measured by human observers.

Accordingly, in this study, we aimed to clarify the holistic color combination rules of human-preferred Papilionidae butterflies according to the integrated lightness, chroma, and hue. Furthermore, to implement a more perceptual analysis, we applied color similarities based on human visual perception to a color combination analysis method.

Method

In this study, the method developed in our previous study was improved while preserving its framework [6]. Initially, a subjective image classification experiment was conducted to obtain the perceptual color similarities of the images. Subsequently, these similarities were used as input to Hierarchical Density-Based Spatial Clustering of Applications with Noise (HDBSCAN) to classify the images. Then, the color distributions of the image clusters were segmented using a Gaussian mixture model (GMM) to extract the representative colors of each cluster. Finally, the positional relations of a cluster's representative colors in a color space were compared with the color combination types defined in our previous study to determine the color combination characteristics of the clusters.

This study was approved by the ethics committee of Chiba University.

Image classification experiment

To determine the perceptual similarity between each pair of images, we conducted a subjective image classification experiment. Experimental methods for measuring the perceptual similarities of images include the following: (a) the table scaling experiment, in which observers arrange the stimuli on a table according to their perceptual similarities; (b) the computer scaling experiment, in which a reference image is compared with several images and the image most similar to the reference is selected [19]; and (c) the ViSiProG test, in which several stimuli are presented simultaneously and observers select and place similar stimuli in a separate box to form clusters [20]. To classify images of similar colors, as required in this study, a modified version of the ViSiProG test was designed.

We selected the 118 Papilionidae butterfly images used in our previous work as human-preferred Papilionidae butterflies (Fig 1) [6] as the stimuli. These images were centered on a white square background (R, G, B = 1.0, 1.0, 1.0) covering an area of 12 cm².

Fig 1. 118 images of butterflies of the Papilionidae family.

The corresponding code numbers are included with each image.

https://doi.org/10.1371/journal.pone.0240356.g001

As the experimental environment, two monitors (50 in and 32 in) connected to a MacBook Pro (Retina, 13-inch, Early 2015) (Apple, Inc., Cupertino, CA, USA) were placed in a darkened room (Fig 2). The 50 in monitor (TH-50LFE7J, Panasonic Corp., Osaka, Japan) was positioned to the right of the 32 in monitor (BL3201PT, BenQ, Inc., Taipei, Taiwan). An i1 Display Pro (X-Rite, Inc., Grand Rapids, MI, USA) was used to calibrate the colors of both monitors. On the 50 in monitor, thumbnail images of all stimuli were presented simultaneously against a black background (R, G, B = 0, 0, 0), arranged in random order at a size of 5 cm² (Fig 3(B)). These images were enlarged to 12 cm² upon double-clicking. On the 32 in monitor, the enlarged images and two folders were presented, with one folder open (Fig 3(A)). The folders had a black background (R, G, B = 0, 0, 0) and displayed no extra information (e.g., toolbar and sidebar). The background of the left (32 in) monitor was “Solid Gray Pro Ultra Dark” (R, G, B = 0.188, 0.188, 0.188) in macOS. The viewing distance between the monitor and the observer was 75 cm. Thirty-one observers with normal color vision from Chiba University participated in the experiment (16 males and 15 females, aged from their late teens to 30s).

Fig 3. Experiment window.

(a) Folder area on the left monitor. (b) Thumbnail images displayed on the right monitor.

https://doi.org/10.1371/journal.pone.0240356.g003

In the experiment, each observer took Ishihara’s test for color deficiency and was taught how to operate the experimental window. They were instructed to classify the stimuli based on only color, independent of shape and pattern. During the experiment, each observer dragged similarly colored images from the right monitor into the same folder on the left monitor. They could not include the same stimulus in the other folders, but could add new folders as necessary. This trial was repeated until all 118 stimuli were classified. The observers were briefed on the experiment and its safety, and provided written consent before beginning the experiment. For minor observers, consent from parents or guardians was not required by the ethics committee.

Image classification

To classify the 118 Papilionidae images, we calculated the similarities of the images based on the experimental results. First, to obtain the similarities, the frequency at which a pair of images was classified into the same folder in the experiment was divided by the number of observers. The distance matrix was then defined by subtracting the similarity of each pair of images from 1. The matrix consisted of 117 rows and columns, and each element denotes the distance between a pair of images. Table 1 shows part of the distance matrix as an example.
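As a concrete illustration, the distance-matrix construction described above can be sketched as follows. This is a minimal sketch: the assumed data layout (one folder label per image per observer) is a simplification for illustration, not the study's actual data format.

```python
def distance_matrix(classifications, n_images):
    """Build the perceptual distance matrix described in the text.

    classifications: one dict per observer mapping image index -> folder label.
    Similarity = co-classification frequency / number of observers;
    distance = 1 - similarity.
    """
    n_obs = len(classifications)
    D = [[0.0] * n_images for _ in range(n_images)]
    for i in range(n_images):
        for j in range(n_images):
            if i == j:
                continue  # distance of an image to itself stays 0
            co = sum(1 for c in classifications if c[i] == c[j])
            D[i][j] = 1.0 - co / n_obs
    return D

# Toy example: 3 observers classify 4 images; images 0 and 1 always share
# a folder, so their distance is 0.
obs = [
    {0: "a", 1: "a", 2: "b", 3: "b"},
    {0: "x", 1: "x", 2: "x", 3: "y"},
    {0: "p", 1: "p", 2: "q", 3: "r"},
]
D = distance_matrix(obs, 4)
```

With this toy data, `D[0][1]` is 0.0 (always co-classified) and `D[1][2]` is 1 − 1/3, since only the second observer grouped images 1 and 2 together.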

Using the distance matrix, HDBSCAN was performed with the dbscan package of R (The R Foundation, Vienna, Austria) to classify the images [21]. The minimum cluster size was two. HDBSCAN is a hierarchical extension of density-based spatial clustering of applications with noise (DBSCAN) [22]; it can determine the clusters automatically while removing noise. Because HDBSCAN improves on the hierarchical cluster analysis used in our previous work, it was used in this study.
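To illustrate the density-based idea that HDBSCAN builds on, the following is a minimal plain DBSCAN over a precomputed distance matrix (pure Python, for orientation only). HDBSCAN, as used in the study via the R dbscan package, replaces the fixed `eps` radius with a hierarchy over all density levels and extracts the most stable clusters; this sketch shows only the single-level case.

```python
def dbscan(D, eps, min_pts):
    """Minimal DBSCAN on a precomputed distance matrix D (list of lists).

    Returns one label per point; -1 marks noise. Illustration only:
    HDBSCAN sweeps eps hierarchically instead of fixing it.
    """
    n = len(D)
    labels = [None] * n
    cluster = -1
    for p in range(n):
        if labels[p] is not None:
            continue
        neigh = [q for q in range(n) if D[p][q] <= eps]
        if len(neigh) < min_pts:
            labels[p] = -1  # provisionally noise; may become a border point
            continue
        cluster += 1  # p is a core point: start a new cluster
        labels[p] = cluster
        seeds = list(neigh)
        while seeds:
            q = seeds.pop()
            if labels[q] == -1:
                labels[q] = cluster  # former noise becomes a border point
            if labels[q] is not None:
                continue
            labels[q] = cluster
            q_neigh = [r for r in range(n) if D[q][r] <= eps]
            if len(q_neigh) >= min_pts:
                seeds.extend(q_neigh)  # q is core: expand the cluster
    return labels

# Toy example: six points on a line; distances are absolute differences.
xs = [0.0, 0.1, 0.2, 5.0, 5.1, 10.0]
D = [[abs(a - b) for b in xs] for a in xs]
labels = dbscan(D, eps=0.3, min_pts=2)
```

On this toy data, the first three points form one cluster, the next two form another, and the isolated point is flagged as noise.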

Color combination characteristics

In our previous work, the representative colors of the classified images were extracted by a GMM with the mclust package of R (The R Foundation, Vienna, Austria) [23]. The GMM treats the color distribution as a mixture of multiple Gaussians and represents the probability density function of each Gaussian mixture component by three variables, namely, the mean values (μ), the mixing proportions (πk, the estimated population that configures a Gaussian), and the covariance matrix (Σ), which were regarded as the values, sizes, and ranges of a representative color, respectively. Because proportions under 3% are almost invisible [24], they were excluded from the analysis [6]. The number of components in each cluster was determined using the minimum message length (MML) criterion instead of the Bayesian information criterion (BIC), which was used in our previous work [6]. The MML values were calculated for up to 20 components, because further components would complicate the results. Under MML, the shorter the code built for the data, the better the data-generating model [25]. GMM with MML outperformed GMMs with other criteria, including BIC, in estimating the number of components [25]. Furthermore, it has proven effective in unsupervised color image segmentation with GMM and automatic estimation of the number of components [26]. Therefore, because GMM with MML improves upon the results obtained in our previous work using GMM with BIC, it was used in this study.
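For orientation, the code-length objective minimized by the GMM-with-MML approach of Figueiredo and Jain [25] can be sketched as follows (notation follows that paper; this is a summary for the reader, not the exact implementation used in this study):

```latex
% MML code length minimized over the mixture parameters \theta
% for data Y with n observations (Figueiredo & Jain, 2002):
\mathcal{L}(\theta, Y) =
  \frac{N}{2} \sum_{m:\,\alpha_m > 0} \ln\!\left(\frac{n\,\alpha_m}{12}\right)
  + \frac{k_{\mathrm{nz}}}{2}\,\ln\frac{n}{12}
  + \frac{k_{\mathrm{nz}}\,(N+1)}{2}
  - \ln p(Y \mid \theta)
```

Here α_m are the mixing proportions (πk above), k_nz is the number of components with nonzero proportion, and N is the number of parameters per component (N = d + d(d+1)/2 for a d-dimensional Gaussian with a full covariance matrix). The candidate model, here among up to 20 components, that yields the shortest total code length is selected.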

To determine the color combination characteristics of each cluster, we used the same color combination types as defined in our previous work (Table 2) [6]. The representative colors were classified into one of the categories listed in Table 2 based on their mean values in the CIELCh color space (the lightness, chroma, and hue scales of CIELAB) [27]. The color combinations of the clusters were then determined from the color combination types in Table 2 based on the category combinations of the representative colors.
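The classification step above can be sketched in code. Note that the study's actual category boundaries are defined in its Table 2 (not reproduced here), so the numeric thresholds below are hypothetical placeholders chosen only to make the logic concrete.

```python
def hue_difference(h1, h2):
    """Smallest angular difference between two hue angles (degrees)."""
    d = abs(h1 - h2) % 360.0
    return min(d, 360.0 - d)

def combination_type(c1, c2, dl=30.0, dc=30.0, dh_similar=60.0, dh_comp=150.0):
    """Classify a pair of representative colors into combination types.

    c1, c2: (L*, C*ab, hab) tuples, i.e., CIELCh means of two components.
    The threshold defaults (dl, dc, dh_similar, dh_comp) are illustrative
    assumptions, not the categories of the study's Table 2.
    Returns (lightness, chroma, hue) combination labels.
    """
    L1, C1, h1 = c1
    L2, C2, h2 = c2
    lightness = "contrasting" if abs(L1 - L2) >= dl else "similar"
    chroma = "contrasting" if abs(C1 - C2) >= dc else "similar"
    dh = hue_difference(h1, h2)
    if dh >= dh_comp:
        hue = "complementary"  # hues roughly opposite on the hue circle
    elif dh <= dh_similar:
        hue = "similar"
    else:
        hue = "contrasting"
    return lightness, chroma, hue
```

For example, under these illustrative thresholds, a dark dull orange paired with a light dull orange, e.g., `combination_type((20, 15, 70), (75, 20, 60))`, is labeled contrasting lightness, similar chroma, and similar hue, matching pattern 1 above.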

Table 2. Defined color categories and color combination types in our previous work.

https://doi.org/10.1371/journal.pone.0240356.t002

Results

The selected images were classified into 24 clusters based on their perceptual similarity. Fig 4 depicts the dendrogram of the classified images and their cluster numbers. Table 3 lists the number of components (representative colors) and the best MML values in each cluster. Table 4 shows the μ, πk, and Σ values of each component in the clusters.

Fig 4. Simplified dendrogram showing 24 clusters of 118 images obtained by HDBSCAN.

The eps value shows the value of the epsilon neighborhood parameter. The width of the vertical line shows the number of points (images) in the cluster. Blue rectangles show each cluster. The gray dotted rectangle shows the images removed as noise.

https://doi.org/10.1371/journal.pone.0240356.g004

Table 3. Number of components (“G”) in each cluster (“No.”) selected by GMM.

https://doi.org/10.1371/journal.pone.0240356.t003

Table 4. Proportions (“πk”), mean values (“μ”), and covariance matrices (“Σ”) for each component (“No”).

https://doi.org/10.1371/journal.pone.0240356.t004

From an interpretation of Table 4 according to the definition of Table 2, the color combinations of the clusters tend to exhibit the following four patterns:

  1. contrasting lightness, similar chroma, and similar hue
  2. contrasting lightness, contrasting chroma, and similar hue
  3. similar lightness, similar chroma, and complementary hue
  4. similar lightness, similar chroma, and similar hue

Figs 5–8 depict the relative frequencies of CIELCh categories in each cluster for color combination patterns 1–4. In Fig 5, 11 clusters were included in pattern 1. In terms of lightness (Fig 5(A)), low-category colors were more frequent than their high-category counterparts, and there were few mid-category colors. This result indicates dominant low lightness and contrasting lightness combinations. In terms of chroma (Fig 5(B)), the low categories were the most frequent; there were a few middle categories but no high categories. This indicates dominant low chroma and similar chroma combinations. In terms of hue (Fig 5(C)), hues 2–4 occurred relatively frequently, and no hue beyond the sixth category was present. This indicates dominant orange to yellow (the ranges of hue names in CIELCh were determined based on [28, 29]) and similar hue combinations. Therefore, these clusters mainly have contrasting lightness, similar chroma, and similar hue combinations.

Fig 5.

Relative frequencies of (a) L*, (b) C*ab, and (c) hab categories in color combination pattern 1.

https://doi.org/10.1371/journal.pone.0240356.g005

Fig 6.

Relative frequencies of (a) L*, (b) C*ab, and (c) hab categories in color combination pattern 2.

https://doi.org/10.1371/journal.pone.0240356.g006

Fig 7.

Relative frequencies of (a) L*, (b) C*ab, and (c) hab categories in color combination pattern 3.

https://doi.org/10.1371/journal.pone.0240356.g007

As shown in Fig 6, five clusters were included in pattern 2. In terms of lightness (Fig 6(A)), the same tendency as that in Fig 5(A) was observed. As for chroma (Fig 6(B)), the same tendency as that of lightness appeared. These results indicate the dominant low lightness (chroma) and contrasting lightness (chroma) combinations. In terms of hue (Fig 6(C)), the hues 2–4 occurred relatively more frequently, and no further than the fourth hue appeared. This indicates the dominant orange to yellow and similar hue combinations. Therefore, these clusters mainly have contrasting lightness, contrasting chroma, and similar hue combinations.

As shown in Fig 7, three clusters were included in pattern 3. In terms of lightness (Fig 7(A)), the low categories were the most frequent. There were few middle categories, and no high categories. In terms of chroma (Fig 7(B)), the same tendency as that of lightness appeared. These indicate the dominant low lightness (chroma) and similar lightness (chroma) combinations. In terms of hue (Fig 7(C)), the hues 2–4 occurred relatively more frequently, and there were a few ninth and tenth hues. These indicate the dominant orange and complementary hue combinations. Thus, these clusters mainly have similar lightness, similar chroma, and complementary hue combinations.

In Fig 8, two clusters were included in pattern 4. In terms of lightness and chroma (Fig 8(A) and 8(B)), the same tendencies as those in Fig 7(A) and 7(B) appeared, respectively. These indicate the dominant low lightness (chroma) and similar lightness (chroma) combinations. In terms of hue (Fig 8(C)), the same tendency as that in Fig 6(C) appeared. These indicate the dominant orange and similar hue combinations. Accordingly, these clusters mainly have similar lightness, similar chroma, and similar hue combinations.

Fig 8.

Relative frequencies of (a) L*, (b) C*ab, and (c) hab categories in color combination pattern 4.

https://doi.org/10.1371/journal.pone.0240356.g008

The above four patterns did not appear in clusters 9, 11, and 16. In cluster 9 (Table 4(i)), the middle lightness and chroma were dominant, and hues 2–4 were frequent. Thus, cluster 9 only has similar hue combinations. In cluster 11 (Table 4(k)), the middle and high lightness were frequent; the low chroma was the most frequent, with a few middle chroma and no high chroma; and hues 2–4 were frequent. Thus, cluster 11 mainly has similar chroma and similar hue combinations. In cluster 16 (Table 4(p)), the low and high lightness were frequent, along with the low and middle chroma, and hues 2–4 were frequent. Therefore, cluster 16 mainly has contrasting lightness and similar hue combinations.

In addition, these color combination patterns are not always consistent with the structure of the dendrogram in Fig 4. In this dendrogram, images with more similar colors belong to closer branches. However, a few clusters sharing the same color combination pattern are found far from each other. Furthermore, in Fig 4, the conspicuous chromatic colors of clusters belonging to the same branch are similar. For example, clusters 7 and 8, which belong to the same branch, consist of images of yellow and yellow-green butterflies; clusters 1 and 2 consist of images of blue-green butterflies; and clusters 12–20 consist of images of yellow butterflies. (Note that the relative areas of the conspicuous chromatic colors differ.) Therefore, the results of the image clustering in this study could also have been influenced by the conspicuous chromatic colors.

Discussion and limitations

We obtained the holistic color combination rules of human-preferred Papilionidae butterflies for the four patterns mentioned above. Previous psychological color harmony studies invariably obtained the following robust color harmony principles: “High lightness,” “Unequal lightness values” (large lightness difference), “Equal chroma” (same or similar chroma), and “Equal hue” (same or similar hue) [1–4]. The latter three principles, excluding “High lightness,” qualitatively agreed with the results of our previous work, i.e., contrasting lightness, similar chroma, and similar hue [6]. In this study as well, color combination pattern 1 is consistent with those three principles (“Unequal lightness values,” “Equal chroma,” and “Equal hue”). As they appeared most frequently among the clusters, contrasting lightness, similar chroma, and similar hue are the most dominant color combinations of human-preferred Papilionidae butterflies.

Similar lightness, contrasting chroma, and complementary hue differ from the above principles; however, they appeared in color combination patterns 2–4 and as a minority of the results in our previous work [6]. The similar lightness and complementary hue qualitatively agree with the “equal lightness” and “complementary hue” that are part of the conventional color harmony principles [1]. The contrasting chroma also appears in the results of Chuang and Ou, whose results exhibited large 95% confidence intervals [18]. These color combinations likely did not appear in previous psychological color harmony studies because the color appearance attributes were handled independently and the results were simplified. In contrast, this study examined the results by integrating the color appearance attributes. Therefore, these color combinations may not be universal but may harmonize to a limited extent, under the following conditions: similar lightness harmonizes when combined with similar chroma; contrasting chroma harmonizes when combined with contrasting lightness and similar hue; and complementary hue harmonizes when combined with similar lightness and similar chroma.

Furthermore, this study has several limitations. First, the conditions under which similar lightness, contrasting chroma, and complementary hue harmonize were shown. Because these color combination rules were obtained from the human-preferred Papilionidae butterflies for which color harmony was demonstrated in our previous work, the rules are valid for those samples. However, we cannot generalize them into color harmony theories based on such limited samples. Psychological experiments investigating the harmony of these color combinations under those conditions are required in future work.

Second, the color combination types were defined based on ranges of the segmented color space. These definitions of similar and contrasting color combinations may not always agree with perception. To achieve a more perceptual color combination analysis, the thresholds of similarity and contrast must be determined experimentally in future work.

Moreover, the results of the perception-based image clustering in this study appear to have been influenced by the conspicuous chromatic colors. Therefore, in future work, the conspicuous chromatic colors should also be included in the color combination analysis.

Similarly, the underlying concepts of the methods of our previous study and this study must be aligned so that their results can be compared. In our previous study, we employed a simple and conventional approach to obtain standard results (i.e., data comparable with those of future works), because no color combination analysis method had yet been established. On the other hand, because fuzzy logic has been applied to color image segmentation and to image-based color planning systems [8, 30–32], it could also be applied to color combination analysis in future works.

Finally, the method used in this study can be improved to enable application to the color combination analysis of other aesthetic objects (e.g., flowers and jewelry). We will analyze and clarify the color combination rules of other aesthetic objects in future works. If such aesthetic objects reflect human psychological aesthetic responses, their color combination rules may agree with the results of this study and of previous color harmony studies. Moreover, color combination rules peculiar to other aesthetic objects could also be revealed.

Conclusion

In this study, we aimed to clarify the perceptual holistic color combination rules of human-preferred Papilionidae butterflies. To achieve this, the Papilionidae butterfly images were classified via hierarchical density-based spatial clustering based on experimentally obtained perceptual color similarities. The color combinations of the clustered images were determined based on representative colors extracted by the GMM with minimum message length. We obtained the following holistic color combination rules of Papilionidae:

  1. contrasting lightness, similar chroma, and similar hue
  2. contrasting lightness, contrasting chroma, and similar hue
  3. similar lightness, similar chroma, and complementary hue
  4. similar lightness, similar chroma, and similar hue

The first rule agrees with the results of our previous work and some of the most robust harmony principles of psychological studies. The other rules suggest that similar lightness, contrasting chroma, and complementary hue harmonize to a limited extent. Future studies will focus on the following issues:

  • Experimental verification of the harmonies of the above color combination rules, except the first rule
  • Color combination analysis of other aesthetic objects
  • Further perceptual analysis based on the conspicuous chromatic colors and a fuzzy logic
  • Experimental clarification of the threshold for similar and contrasting color combinations

References

  1. Ou L, Chong P, Luo MR, Minchew C. Additivity of colour harmony. Color Res. Appl. 2011;36(5):355–372.
  2. Ou L, Luo MR. A colour harmony model for two-colour combinations. Color Res. Appl. 2006;31(3):191–204.
  3. Ou L, Yuan Y, Sato T, et al. Universal models of colour emotion and colour harmony. Color Res. Appl. 2018;43(5):736–748.
  4. Szabó F, Bodrogi P, Schanda J. Experimental modeling of colour harmony. Color Res. Appl. 2010;35(1):34–49.
  5. Kobayasi M. Analysis of color combination in fine art paintings. Proc. Int. Symp. Multispectral Imaging Color Reproduction. 1999;139–142.
  6. Kakehashi E, Muramatsu K, Hibino H. Computational color combination analysis of Papilionidae butterflies as aesthetic objects. Color Res. Appl. 2019.
  7. Kinoshita S, Yoshioka S. Structural colors in nature: The role of regularity and irregularity in the structure. ChemPhysChem. 2005;6(8):1442–1459. pmid:16015669
  8. Hsiao S, Tsai C. Transforming the natural colors of an image into product design: A computer-aided color planning system based on fuzzy pattern recognition. Color Res. Appl. 2015;40(6):612–625.
  9. Design Seeds. "For all who love color". 2009. Available at: https://www.design-seeds.com/ (Accessed 5 June 2018).
  10. Matsue Y. Nachuraru hamoni to konpurekkusu hamoni no haishoku hyoka no chigai [The differences in the evaluation of natural harmony and complex harmony] (in Japanese). Official Journal of Basic Design and Art. 1997;6:51–56.
  11. Adachi Y, Ohyama M, Kusaki M, Takano Y. Harmonious feeling of colors appearing on plants (First report): Comparison with colors recorded in JIS Z 8721 (in Japanese). Journal of the Color Science Association of Japan. 2008;32(2):85–93.
  12. Gabriele B, Schlegel B, Kauf P, Rupf R. The importance of being colorful and able to fly: Interpretation and implications of children's statements on selected insects and other invertebrates. Int. J. Sci. Educ. 2015;37(16):2664–2687.
  13. Shipley N, Bixler R. Beautiful bugs, bothersome bugs, and fun bugs: Examining human interactions with insects and other arthropods. Anthrozoös. 2017;30(3):357–372.
  14. Beldade P, Brakefield P. The genetics and evo-devo of butterfly wing patterns. Nature Reviews Genetics. 2002;3:442–452. pmid:12042771
  15. Matsuka M, Kuribayashi S, Umetani K. Ajia no konchu shigen [Insect resources in Asia] (in Japanese). Japan International Research Center for Agricultural Sciences. 1998.
  16. Watanabe K. The patterns of butterflies: The relevance with the image of butterflies in literature (in Japanese). Costume and Textile. 2008;9(1):81–91.
  17. Layberry R, Hall P, Lafontaine D. The Butterflies of Canada. University of Toronto Press; 1998.
  18. Chuang MC, Ou L. Influence of a holistic color interval on color harmony. Color Res. Appl. 2001;26(1):29–39.
  19. Rogowitz BE, Frese T, Smith JR, Bouman CA, Kalin EB, Pappas TN. Perceptual image similarity experiments. Human Vision and Electronic Imaging III. 1998; Proc. SPIE 3299:576–590.
  20. Zujovic J, Pappas TN, Neuhoff DL, van Egmond R, de Ridder H. Effective and efficient subjective testing of texture similarity metrics. Journal of the Optical Society of America A. 2015;32(2). pmid:26366606
  21. dbscan: Density-based spatial clustering of applications with noise (DBSCAN) and related algorithms. R package documentation. Available at: https://cran.r-project.org/web/packages/dbscan/dbscan.pdf (Accessed 6 December 2019).
  22. Campello RJGB, Moulavi D, Sander J. Density-based clustering based on hierarchical density estimates. Lecture Notes in Computer Science. 2013;7819:160–172.
  23. mclust: Gaussian mixture modelling for model-based clustering, classification, and density estimation. R package documentation. Available at: https://cran.r-project.org/web/packages/mclust/mclust.pdf (Accessed 6 December 2019).
  24. Lišková S, Frynta D. What determines bird beauty in human eyes? Anthrozoös. 2013;26(1):27–41.
  25. Figueiredo MAT, Jain AK. Unsupervised learning of finite mixture models. IEEE Transactions on Pattern Analysis and Machine Intelligence. 2002;24(3):381–396.
  26. Wu Y, Yang X, Chan K. Unsupervised color image segmentation based on Gaussian mixture model. Proceedings of the 2003 Joint Conference of the 4th International Conference on Information, Communications and Signal Processing and 4th Pacific-Rim Conference on Multimedia. 2003;1:541–544.
  27. Commission Internationale de l'Éclairage (CIE). Technical Report 15:2004: Colorimetry, 3rd ed. Vienna: CIE Central Bureau; 2004.
  28. Blanchard CE, Haadsma RD. Harmonizing color selection system and method. US patent 6,870,544; 2005.
  29. Jonauskaite D, Mohr C, Antonietti J, et al. Most and least preferred colours differ according to object context: New insights from an unrestricted colour range. PLoS One. 2016;11(3):e0152194. pmid:27022909
  30. Konstantinidis K, Gasteratos A, Andreadis I. Image retrieval based on fuzzy color histogram processing. Optics Communications. 2005;248(4–6):375–386.
  31. Siang TK, Mat IN. Color image segmentation using histogram thresholding Fuzzy C-means hybrid approach. Pattern Recognition. 2011;44(1):1–15.
  32. Cheng HD, Jiang XH, Sun Y, Wang J. Color image segmentation: Advances and prospects. Pattern Recognition. 2001;34(12):2259–2281.