A New Small-Size Camera with Built-In Specific-Wavelength LED Lighting for Evaluating Chlorophyll Status of Fruit Trees

To produce high-quality crops, not only excellent cultivation techniques but also accurate nutrient management techniques are important. In recent years, many nondestructive tools such as the chlorophyll meter “SPAD” and the leaf nitrogen meter “Agri Expert CCN” have been developed for measuring crop leaf chlorophyll and nitrogen contents. However, such devices are still relatively expensive for individual farmers. In this research, we developed a low-cost and small-size camera with built-in LEDs of several specific wavelengths for evaluating the nutrient status of fruit trees. A total of 2 camera prototypes were developed by integrating 3 independent LEDs of specific wavelengths (Camera 1: 950 nm, 660 nm and 560 nm; Camera 2: 950 nm, 660 nm and 727 nm) into the device. In addition, a simple software tool was developed to enable the camera to capture leaf images under different LED lighting conditions. Using the prototypes, we acquired images of apple leaves and investigated the possibility of using the images to estimate the leaf nutrient status indicators SPAD (chlorophyll) and CCN (nitrogen), whose values were obtained using the above-mentioned standard tools. The results indicate that the Camera 1 prototype is superior to the Camera 2 prototype and can potentially be applied to the evaluation of nutrient status in apple leaves.


Introduction
To produce high-quality apples to meet the rapidly growing market demand, key cultivation techniques such as artificial pollination, pruning of branches, thinning of flowers and fruits and bagging or rotating on-tree fruits have been adopted and widely practiced in Japan [1,2]. In recent decades, in addition to these meticulous cultivation techniques, optimal fertilization and nutrient management have attracted considerable attention in fruit production [3,4]. Precision nutrient management strategies have been proposed to replace conventional fertilizer practices for apple orchards [5,6]. Conventional fertilizing practices apply the same amount of fertilizer to the orchard regardless of the in-field variations in soil fertility and crop nutrient status, which raises the risk of overuse or underuse in different areas, reducing yield and fruit quality. In comparison, precision nutrient management strategies are characterized by site-specific and need-based nutrient management practices, which necessitate a thorough understanding of the nutrient status of the orchard on an individual tree basis [7,8]. It is therefore important to evaluate the nutrient status of individual trees accurately and in a timely manner at different growth stages when precision nutrient management strategies are implemented in orchards [9,10].
Chlorophyll is not only a pigment that gives the plant its green color, but is also an essential component for the conversion of light energy to stored chemical energy in the process of photosynthesis, which sustains plant life and promotes its growth. The chlorophyll pigment content in a leaf is directly related to the amount of solar radiation absorbed by the leaf, and eventually determines the photosynthetic capacity of the plant [11,12].

Materials and Methods
The experiments were carried out in the Fujisaki Farm of Hirosaki University, Japan (longitude 140°48′48.21″ E and latitude 40°65′61.6″ N). The most widely cultivated 'Fuji' apple trees (Malus domestica), grafted on the rootstock cultivar 'Marubakaidou' (Malus prunifolia Borkh. var. ringo Asami) and aged 25 years, were used in this study. The trees were planted at spacings of approximately 5 × 3.5 m and managed by the input of organic fertilizers made from apple leaves collected from the orchard. Apple leaves were collected from multiple trees in the summer season (14 July 2020) and were screened in the laboratory to ensure a healthy, complete and undamaged status. Because it is desirable in modeling analysis to have a dataset with a wide distribution (variance) in the target variable (chlorophyll content), all of the collected leaf samples were first grouped via visual inspection of leaf color, and the leaf samples used for measurement were then randomly selected from the groups. Therefore, the selected leaves exhibited large variation in degree of greenness. In the experiment, a total of 120 samples were prepared and measured within 4 h after sampling.
An ImSpector V10E (Spectral Imaging Ltd., Oulu, Finland)-based hyperspectral imaging system [10] was used to obtain hyperspectral images for the collected leaf samples. During image acquisition, a white reference board was used to obtain the white reference, and the black reference was acquired when the camera shutter was completely turned off and the lens was covered by a black lens cap. The raw images were automatically calibrated based on the above references. This system collects hyperspectral images covering the 400-1000 nm spectral range with a spectral resolution of 5 nm, and each image has a spatial resolution of 1360 × 1024 pixels. With the Scanner Unit software (JFE Techno-Research Corporation, Tokyo, Japan) attached to the system, the average spectral reflectance data for each leaf sample were extracted and pretreated using a Savitzky-Golay smoothing algorithm for further analysis.
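As a sketch, the calibration and smoothing steps described above can be reproduced as follows. The relative-reflectance formula (raw − dark)/(white − dark) and the smoothing window and polynomial order are assumptions, since the paper reports neither; the data here are synthetic.

```python
import numpy as np
from scipy.signal import savgol_filter

def calibrate_reflectance(raw, white, dark):
    """Relative reflectance from raw, white-reference and dark-reference scans."""
    return (raw - dark) / (white - dark)

# Synthetic example: 121 bands covering 400-1000 nm at 5 nm steps, as in the text.
wavelengths = np.arange(400, 1005, 5)
rng = np.random.default_rng(0)
raw = 0.5 + 0.05 * rng.standard_normal(wavelengths.size)  # noisy leaf signal
white = np.ones_like(raw)                                 # white reference
dark = np.zeros_like(raw)                                 # dark reference

reflectance = calibrate_reflectance(raw, white, dark)
# Savitzky-Golay smoothing; an 11-band window with a 2nd-order polynomial
# is a typical (assumed) choice.
smoothed = savgol_filter(reflectance, window_length=11, polyorder=2)
```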
After the nondestructive hyperspectral image acquisition, the total chlorophyll concentration in leaf samples was determined using a conventional, destructive, solvent-based pigment-extraction method combined with an absorption spectrometer [10]. A total of 4 discs with 12 mm diameters cut from each leaf sample were prepared for pigment extraction using acetone solvent. After pigment extraction, the absorbances at 645 nm, 663 nm and 750 nm in the solvents were measured using a Genesys 10 Bio UV-VIS spectrophotometer (Thermo Fisher Scientific Inc., Waltham, MA, USA). Using the equation proposed by Mackinney [53], the total chlorophyll content of each leaf sample was calculated.
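The conversion from the three absorbance readings to total chlorophyll can be sketched as below. The coefficients are the widely used Arnon-type values for acetone extracts, derived from Mackinney's specific absorption coefficients, and the A750 turbidity correction is an assumption; the paper does not list the exact equation it used.

```python
def total_chlorophyll_mg_per_l(a645, a663, a750=0.0):
    """Total chlorophyll (mg/L of extract) from absorbances at 645/663/750 nm."""
    # Subtract A750 as a turbidity baseline; chlorophylls do not absorb there,
    # which is presumably why 750 nm was measured in the experiment.
    a645 -= a750
    a663 -= a750
    # Arnon-type equation (assumed) based on Mackinney's coefficients.
    return 20.2 * a645 + 8.02 * a663

conc = total_chlorophyll_mg_per_l(0.30, 0.55, 0.01)  # hypothetical readings
```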

Data Analysis
During the analysis of hyperspectral data, it is always necessary to identify several so-called key wavelengths to reduce the high dimensionality of hyperspectral data. These key wavelengths can potentially be used to predict the chlorophyll content in the plant leaf. This is particularly important because the objective of this research is to develop a compact device with built-in specific-wavelength LEDs to measure the nutrient status of the plant leaf, and such a small device cannot provide enough space to accommodate many LEDs.
A few statistical methods have been proposed for selecting the key wavelengths from high-dimensional spectral data [10]. This study employed Pearson's correlation analysis [54] to identify the key wavelengths. The correlation coefficients between the reflectance of specific wavelengths and chlorophyll content were used as a criterion. The wavelengths whose reflectance had the highest correlations with chlorophyll content in certain wavelength ranges were selected. These key wavelengths were then used to develop a chlorophyll content prediction model using a multiple linear regression (MLR) algorithm. In modeling, the sample data were divided into 2 subsets: 2/3 of the entire data (80 samples) was used as the training set to develop the model, and the remaining 1/3 (40 samples) was used as the test set. The performance of the MLR model was used to assess the usefulness of the selected key wavelengths for evaluating the nutrient status of the plant leaf.
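A minimal sketch of this selection-plus-regression workflow on synthetic data (the band count, sample count and 80/40 split follow the text; everything else, including the simulated reflectance model, is an assumption for illustration):

```python
import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(1)
n_samples, n_bands = 120, 121                 # 120 leaves; 400-1000 nm at 5 nm
wavelengths = np.arange(400, 1005, 5)
chlorophyll = rng.uniform(0.01, 0.05, n_samples)       # target variable

# Simulated reflectance: three bands depend (negatively) on chlorophyll,
# mimicking the negative correlations reported later in the text.
X = rng.normal(0.4, 0.02, (n_samples, n_bands))
for i, w in enumerate(wavelengths):
    if w in (560, 675, 710):
        X[:, i] -= 3.0 * chlorophyll

# Key-wavelength selection: Pearson correlation of each band with chlorophyll,
# keeping the bands with the largest absolute correlation.
r = np.array([np.corrcoef(X[:, i], chlorophyll)[0, 1] for i in range(n_bands)])
key_idx = np.argsort(np.abs(r))[-3:]
key_wavelengths = sorted(wavelengths[key_idx].tolist())

# MLR on a 2/3 training / 1/3 test split (80 and 40 samples, as in the text).
X_train, X_test, y_train, y_test = train_test_split(
    X[:, key_idx], chlorophyll, test_size=40, random_state=0)
model = LinearRegression().fit(X_train, y_train)
r2_test = model.score(X_test, y_test)
```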

Design and Development of Camera Prototype and Software
The ultimate purpose of this research was to develop a low-cost and small-size camera for evaluating the nutrient status of fruit trees. The camera was expected to be able to capture images of fruit leaf samples under narrowband light sources such as LEDs. The first prototype version of "Rice Scan" (Maxell Co., Tokyo, Japan) was used as the basis for the design and development of the camera prototype. "Rice Scan" is a small RGB camera developed for assessing the nutrient status of rice crops based on a leaf color index. The camera was tested and evaluated across Japan, with varying degrees of success, and became available on the market in 2018. However, shortly after its release, product development and services were terminated. In this study, the new camera prototype was equipped with multiple narrowband LED light sources instead of the single broadband white LED light source in "Rice Scan". These built-in LEDs emitted narrowband light at the key wavelengths identified in the preceding experimental research.
To control the camera prototype, a simple software tool was designed and developed under the Visual Studio IDE environment (Microsoft Corporation, Redmond, WA, USA). With the help of the software, the camera prototype could be easily triggered to acquire images of leaf samples under each narrowband specific-wavelength LED light condition, and the images could be saved in a computer for further analysis.

Testing and Evaluation of Camera Prototype
To investigate the feasibility of using the camera prototype to assess the nutrient status of the apple leaf, the following experiments and data analyses were performed. First, the chlorophyll meter "SPAD-502plus" (Konica Minolta, Inc., Tokyo, Japan) and the leaf nitrogen meter "Agri Expert CCN6001" (Satake Co., Ltd., Hiroshima, Japan) were employed to measure the SPAD (an index value, a relative indicator of chlorophyll content calculated from the absorbances at 650 nm and 940 nm) and CCN (nitrogen content in % calculated from the absorbances at 4 unpublished wavelengths) values of the leaf samples. Second, the camera prototype was used to capture images of the leaf samples under the identified narrowband specific-wavelength LED light conditions. Third, data were extracted from the images and then related to the SPAD and CCN values of the same samples by means of multiple linear regression analysis.
During the process of data analysis, the RGB data were first extracted from the images captured using the two camera prototypes. Because the prediction models for chlorophyll and nitrogen contents were developed based on the reflectance of key wavelengths, the reflectance intensity of specific-wavelength light emitted from the built-in LEDs needed to be calculated from the raw RGB data. To obtain the reflectance intensity of the specific-wavelength LED emissions, the raw RGB data were converted to L*a*b* values. In an L*a*b* color space, the L* value is a measurement of the lightness or amount of light reflected; the a* value indicates the intensity of the red (+ values) or green (− values) color and the b* value indicates the intensity of the yellow (+ values) or blue (− values) color [55]. Therefore, the L* value represents the reflectance intensity of a specific-wavelength LED light, and can potentially be used to predict chlorophyll and nitrogen contents with the prediction models developed.
In this study, we investigated the possibility of using the data captured with the camera prototypes to predict the SPAD and CCN values. The L* values of the specific-wavelength LEDs obtained for each prototype were used to develop multiple linear models for predicting the SPAD and CCN values. Considering that the leaf color values obtained under the specific-wavelength LED illuminations may provide valuable nutrient-related information, the a* and b* values, in addition to the L* values and the raw RGB values, were also used to develop prediction models.
Because the commercial "SPAD" and "Agri Expert CCN" meters have been widely used in agricultural fields as the nondestructive standard tools to evaluate the nutrient status of plant leaves, a high correlation between the data obtained by the prototype and the SPAD and CCN values would confirm the possibility of the prototype being used as an alternative to these two standard tools. Figure 1 shows the reflectance spectra obtained in the experiment. Significant differences in the reflectance intensity of certain wavelengths, particularly those in the green to red, the red edge and the near-infrared wavelength ranges, were found among the leaf samples (Figure 1a). Leaf samples were grouped by level of chlorophyll content, and the average reflectance of each group was calculated (Figure 1b). It was found that as the level of chlorophyll content increased, the group average reflectance spectra shifted downward sequentially, forming an orderly distribution of group spectra. In particular, the wavelengths in the green to red and the red edge spectral ranges demonstrated larger reflectance intensity differences among groups. This suggests that these wavelength ranges may be more closely associated with the chlorophyll content in apple leaves than other wavelengths. Our previous research also showed a similar relationship between reflectance intensity and nitrogen content in apple leaves and confirmed the usefulness of several key wavelengths in these ranges in estimating the nitrogen content in apple leaves [10].
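The grouping-and-averaging step behind Figure 1b can be sketched as follows, again on synthetic data in which reflectance decreases as chlorophyll rises (an assumed relationship consistent with the trends reported above):

```python
import numpy as np

rng = np.random.default_rng(2)
chlorophyll = rng.uniform(0.01, 0.05, 120)          # 120 leaf samples
# Synthetic spectra (121 bands): reflectance falls as chlorophyll rises.
spectra = rng.normal(0.4, 0.02, (120, 121)) - 3.0 * chlorophyll[:, None]

# Split the samples into four chlorophyll-level groups at the quartiles
# and average the spectra within each group.
edges = np.quantile(chlorophyll, [0.25, 0.5, 0.75])
groups = np.digitize(chlorophyll, edges)            # 0 (low) .. 3 (high)
group_means = np.stack([spectra[groups == g].mean(axis=0) for g in range(4)])
```

With such data, the group-mean spectra order themselves from the lowest-chlorophyll group (highest reflectance) down to the highest-chlorophyll group, matching the sequential downward shift observed in Figure 1b.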

Correlation between Spectral Reflectance and Chlorophyll Content
Pearson's correlation coefficients between leaf chlorophyll content and the reflectance of each wavelength are illustrated in Figure 2. The reflectance of most wavelengths individually showed a negative correlation with chlorophyll content. Among them, the wavelengths 710 nm and 560 nm demonstrated the strongest negative correlations, achieving significant correlation coefficients of −0.889 and −0.776 (p < 0.001), respectively. The wavelength 675 nm, whose correlation coefficient with chlorophyll content formed a local peak (i.e., the weakest correlation, r = −0.229, p < 0.001) between the above two wavelengths, could be recognized as a characteristic wavelength because its spectral reflectance possesses a distinctive feature compared to its longer and shorter adjacent wavelengths. Our previous study [10] revealed that the reflectance of the wavelengths around 560 nm (green) and 710 nm (red edge) was directly associated with leaf nitrogen content, while the wavelength at 675 nm (red) showed a significant correlation with leaf nitrogen content in terms of the first derivatives of its reflectance (the rates of reflectance increase). Therefore, these three wavelengths were almost the same as those found in our previous study [10]. Numerous previous studies have confirmed the significant relationships between chlorophyll content and nitrogen content in plant leaves [10,24,38,39]; therefore, the above 3 specific wavelengths (560 nm, 675 nm and 710 nm) could potentially be used as the key wavelengths to estimate both chlorophyll and nitrogen contents in apple leaves.


Chlorophyll Content Prediction Model
Based on the above analysis, 3 key wavelengths, i.e., 560 nm, 675 nm and 710 nm, were identified and used to develop an MLR model for chlorophyll content prediction. Figure 3 illustrates the performance of the MLR model with both the training and the test data sets. The model achieved high predictive accuracy with both the training (R² = 0.8585 and RMSE = 0.0029 (p < 0.001)) and the test (R² = 0.8018 and RMSE = 0.0035 (p < 0.001)) data sets, showing high consistency in the model predictions. This result confirmed the usefulness of the above three key wavelengths for estimating the chlorophyll content in apple leaves. These key wavelengths are among those selected for estimating the nitrogen content in apple leaves in earlier research [10].


Design and Development of Camera Prototype
In designing the camera prototype, three LEDs, each emitting light at one of the three key wavelengths, were used as the lighting for the prototype. In addition, another LED at the wavelength of 950 nm was added to the prototype; due to the high leaf reflectance at this wavelength, it has often been used as a reference in the development of similar devices [56,57]. Since LEDs at 675 nm and 710 nm were not available in the market, they were replaced with 660 nm and 727 nm LEDs, the closest available wavelengths to these 2 key wavelengths. Finally, the commercial narrow-band LEDs manufactured by ams-OSRAM (OSRAM GmbH, München, Germany), LP M675 (560 nm), GH DASPA2.24 (660 nm), GF DASPA2.24 (727 nm) and SFH4646 (950 nm), were purchased for the prototype development. The relative spectral emissions of the four LEDs are shown in Figure 4. Each LED emits light with its peak at or near the key wavelength and a narrow full width at half maximum (FWHM). The emission spectrum of the white LED used in "Rice Scan" (black solid line) and the CIE (International Commission on Illumination) standard photopic luminous efficiency function Vλ (dashed line) are indicated for comparison.

Due to limited space in the small prototype designed, only three LEDs could be added into the prototype. Therefore, two camera prototypes were developed separately, each with a different combination of three LEDs (Table 1).
Figure 5 shows the diagrammatic representation of the camera prototypes. The lighting system, including three LEDs, and a CMOS sensor (OMNIVISION OV7739) were integrated on the upper side of the board. A micro-USB port was designed on the lower part of the prototype.
The two camera prototypes developed are shown in Figures 6 and 7. The prototype is a palm-sized (70 mm height × 60 mm width × 18 mm depth) and lightweight (60 g) device, equipped with an entrance window of 3 mm × 3 mm and a CMOS sensor that can capture an RGB image with a resolution of 640 × 480 pixels for an area of 2 mm × 2 mm on the leaf sample. During measurement, a plant leaf was placed on the entrance window and the cover was closed down over the leaf to eliminate ambient light interference. A specific-wavelength LED was selected with the software developed in parallel, and the image was then captured under the emitted light condition. The Japanese syllabary characters "カメラ" on the camera prototypes mean "camera" in English.

Figure 7. The camera prototype is equipped with an entrance window of 3 mm × 3 mm and a CMOS sensor that can capture an RGB image with a resolution of 640 × 480 pixels. During measurement, a plant leaf was placed on the entrance window and the cover was put down closely on the leaf to avoid ambient light interference. A specific-wavelength LED was selected with the software, and the image was then captured under the emitted light condition.


Design and Development of Software for Camera Prototype
A software tool for controlling the camera prototype was designed and developed under the Visual Studio IDE environment (Microsoft Corporation, Redmond, WA, USA). Figure 8 shows the interface of the software to capture images using the camera prototype. In image acquisition, a specific-wavelength LED was first selected, and the light of the selected specific wavelength was emitted when the image was captured by clicking the "Capture" button. The captured image could be saved into a designated folder by clicking the "Save" button. The image shown in the figure was captured for a green leaf under the LED2 (660 nm) lighting condition using the Camera 1 prototype.
Sensors 2023, 23, x FOR PEER REVIEW
Figure 8. The interface of the software developed to capture images for the camera prototype.
Figure 9 shows the images captured for the white board (used as a reference) and three apple leaf samples using the two camera prototypes. The images captured under the four LED lighting conditions show significant differences between one another. This suggests the possibility of extracting from the images the reflectance of the specific-wavelength light emitted from the built-in LEDs onto the leaf, which can be used for estimating the nutrient status of the leaf.

Relationships between SPAD, CCN Value and Chlorophyll Content
The relationships between the nondestructively measured SPAD and CCN values, and between the SPAD value and the chlorophyll content measured using a destructive, solvent-based pigment-extraction method combined with an absorption spectrometer were investigated. A significant correlation (r = 0.9318 (p < 0.001)) between the measured SPAD values and chlorophyll contents in leaves was found (Figure 10), suggesting the usefulness of the SPAD meter in evaluating the chlorophyll content in the apple leaf. This agrees with the results of numerous previous studies that have confirmed the effectiveness of using SPAD values to assess the nutrient status of a variety of plants and crops [42][43][44][50][51][52].

In addition, there was also a significant correlation (r = 0.9173 (p < 0.001)) between the measured SPAD and CCN values (Figure 11). Although the "SPAD meter" and "Agri Expert CCN" meter were developed for measuring the chlorophyll and nitrogen contents in the plant leaf, respectively, the results measured using these two devices were significantly correlated with each other. Similar results have also been reported in the literature, and the reasons have mainly been attributed to the close relationship between the chlorophyll and nitrogen contents in the plant leaf [10,24,38,39]. Our previous study confirmed that the SPAD value not only could be used to accurately predict the chlorophyll content, but it could also be used to predict the nitrogen content in the apple leaf with reasonable accuracy [47]. Furthermore, the CCN value can be directly interpreted as the actual level of nitrogen concentration in the plant leaf [58][59][60]. Therefore, if the camera prototype developed in this research could accurately predict the SPAD and CCN values, it could potentially be used as a new tool to evaluate the nutrient status of plants and crops.

Performance of SPAD and CCN Prediction Models Based on Images Captured Using Camera Prototype
The SPAD and CCN values for leaf samples predicted using the above models were plotted against the measured SPAD and CCN values, and simple correlation analyses were conducted to investigate the relationships between them. Figure 12 shows the scatter plots of the measured SPAD values vs. the predicted SPAD values. For Camera 1, the models based on the L*, L*a*b* and RGB data all performed excellently in predicting the SPAD values, with correlation coefficients of 0.9211, 0.9394 and 0.9387, respectively. In comparison, for Camera 2, the model based on the L* data obtained a correlation coefficient of 0.8834, which is significantly lower than that of the same model for Camera 1, and the models based on the L*a*b* and RGB data also performed well, with predictive accuracies only slightly lower than those for Camera 1. These results indicated that, overall, Camera 1 was superior to Camera 2 in predicting the SPAD values, and the inclusion of color information in addition to the specific-wavelength reflectance intensities could improve the performance of the prediction models. Figure 13 shows the scatter plots of the measured CCN values vs. the predicted CCN values. Compared to the models for the SPAD values, the models for the CCN values all achieved a relatively lower prediction accuracy. For Camera 1, the correlation coefficients between the measured and the predicted CCN values for the models based on the L*, L*a*b* and RGB data were 0.8264, 0.8686 and 0.8644, respectively. Furthermore, the models for Camera 2 performed worse than the corresponding models for Camera 1. Among them, the model based on the L* data obtained the lowest correlation coefficient of 0.6594, and the models based on the L*a*b* and RGB data did not perform satisfactorily, with predictive accuracies significantly lower than those for Camera 1.
These results indicated that Camera 1 was similarly superior to Camera 2 in predicting the CCN values, and they confirmed the usefulness of color information in improving the performance of the prediction models. Taken together, these results suggest that the images captured using the Camera 1 prototype may contain more nutrient-related information on "Fuji" apple leaves than those captured using the Camera 2 prototype.
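The prediction models above are regressions from image-derived color features to the meter readings, evaluated by correlating measured and predicted values. As a minimal illustration only, the sketch below fits a single-predictor least-squares model (SPAD from mean L* alone, a simplification of the paper's multi-feature models) on hypothetical data and scores it in the same way:

```python
import math

def ols_fit(x, y):
    """Ordinary least squares for one predictor: y ~ slope*x + intercept."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    slope = (sum((a - mx) * (b - my) for a, b in zip(x, y))
             / sum((a - mx) ** 2 for a in x))
    return slope, my - slope * mx

def corr(x, y):
    """Pearson correlation used to score measured vs. predicted values."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    num = sum((a - mx) * (b - my) for a, b in zip(x, y))
    den = math.sqrt(sum((a - mx) ** 2 for a in x) * sum((b - my) ** 2 for b in y))
    return num / den

# Hypothetical training data: mean L* extracted per leaf image vs. measured SPAD.
# Darker leaves (lower L*) generally carry more chlorophyll (higher SPAD).
l_star = [38.0, 42.5, 35.1, 45.9, 40.2, 48.8, 36.7, 44.1]
spad   = [47.3, 41.0, 51.2, 36.5, 43.8, 33.0, 49.1, 38.2]

slope, intercept = ols_fit(l_star, spad)
predicted = [slope * v + intercept for v in l_star]
r = corr(spad, predicted)   # measured vs. predicted, as in Figures 12 and 13
```

In practice the fit would be made on a training set and r evaluated on held-out leaves; this sketch only shows the shape of the computation behind the correlation coefficients reported above.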

Figure 13. Performance of the CCN prediction models based on the data extracted from the images captured using the camera prototypes. Scatter plots of the measured CCN values vs. the predicted CCN values are illustrated: (a-c) are the results for the models based on L*, L*a*b* and RGB data, respectively, for Camera 1; (d-f) are the results for the models based on L*, L*a*b* and RGB data, respectively, for Camera 2. The lines indicate the best-fit lines.
The above findings demonstrated the potential usefulness of the camera prototypes in evaluating the nutrient status of apple leaves. In particular, Camera 1 outperformed Camera 2 in predicting both the SPAD and the CCN values, indicating greater applicability to real-world practice in the future. This study serves as a preliminary step toward developing a low-cost, small-size and easy-to-use tool for evaluating the nutrient status of plants and crops. The new tool is expected to serve as an alternative to existing commercial tools. Compared with the tools currently available on the market (such as the "SPAD", "GreenSeeker" and "Agri Expert CCN" meters), whose prices range from USD 800 to 2000, the total cost of each camera prototype developed in this study was only USD 100 to 200. Such a low-cost tool will be very attractive to farmers. Future work will include the development of new versions of the camera prototypes with built-in Wi-Fi capability, as well as a mobile application to remotely control the camera. Wi-Fi capability will enable the wireless mobility of the camera, and the mobile app is expected to quickly calculate SPAD, CCN and chlorophyll/nitrogen content estimates from the images captured by the camera and to display the results on mobile devices in real time. These new features will further increase the usability of the tool in crop nutrient management.

Conclusions
Our recent study revealed the possibility of using a simple MLR model based on four narrowband specific wavelengths to estimate the nitrogen content in apple leaves. With the aim of applying this model in practice, the present research developed a low-cost and small-size camera that can collect narrowband specific-wavelength reflectance for evaluating the nutrient status of fruit trees. Two camera prototypes were developed by integrating three independent LEDs of specific wavelengths into a "Rice Scan" device. Using the camera prototypes, we acquired images of "Fuji" apple leaves and investigated the possibility of using the images to estimate the nutrient status indicators obtained by the widely used "SPAD" and "Agri Expert CCN" devices. The results indicated that the data obtained by the Camera 1 prototype had higher correlations with the nutrient status indicators than those obtained by the Camera 2 prototype. This study suggests that the new low-cost and small-size camera with built-in specific-wavelength LED lighting can potentially be applied to the evaluation of nutrient status in apple leaves.