Article

Accuracy Comparison of Estimation on Cotton Leaf and Plant Nitrogen Content Based on UAV Digital Image under Different Nutrition Treatments

1 School of Agriculture, Shihezi University, Shihezi 843000, China
2 National and Local Joint Engineering Research Center of Information Management and Application Technology for Modern Agricultural Production (XPCC), Shihezi 832000, China
3 School of Agriculture, Gansu Agricultural University, Lanzhou 730000, China
* Author to whom correspondence should be addressed.
Agronomy 2023, 13(7), 1686; https://doi.org/10.3390/agronomy13071686
Submission received: 4 May 2023 / Revised: 16 June 2023 / Accepted: 20 June 2023 / Published: 23 June 2023
(This article belongs to the Section Precision and Digital Agriculture)

Abstract:
The rapid, accurate, non-destructive estimation of leaf nitrogen content (LNC) and plant nitrogen content (PNC) in cotton is of great significance to the nutrient management of cotton fields. RGB images of cotton fields in Shihezi (China) were obtained using a low-cost unmanned aerial vehicle (UAV) with a visible-light digital camera. Combined with LNC and PNC data from different growth stages, the correlation between N content and visible-light vegetation indices (VIs) was analyzed, and then Random Forest (RF), Support Vector Machine (SVM), Back Propagation Neural Network (BP), and stepwise multiple linear regression (SMLR) were used to develop N content estimation models at different growth stages. The accuracy of the estimation models was assessed by the coefficient of determination (R2), root mean square error (RMSE), and relative root mean square error (rRMSE), so as to determine the optimal growth stage and the best model. The results showed that the correlation between VIs and LNC was stronger than that with PNC, and the estimation accuracy of the different models decreased continuously with the development of growth stages, with higher estimation accuracy at the peak squaring stage. Among the four algorithms, the best accuracy (R2 = 0.9001, RMSE = 1.2309, rRMSE = 2.46% for model establishment, and R2 = 0.8782, RMSE = 1.3877, rRMSE = 2.82% for model validation) was obtained when applying RF at the peak squaring stage. The LNC model for the whole growth period could be used in the later growth stages due to its higher accuracy. The results of this study showed that there is potential for using an affordable, non-destructive UAV-based digital system to produce predicted LNC maps that are representative of the current field nitrogen status.

1. Introduction

Cotton is an important fiber and oil crop in the world, and its planting area ranks first among cash crops in China. To ensure high and stable cotton production, chemical fertilizer application is a necessary and key practice for local farmers, yet excessive application of fertilizers leads to higher costs, leaching, and soil environmental pollution, especially for nitrogen (N) fertilizers, which are widely and indiscriminately used during cotton growth [1]. N content, an important agronomic index and a critical factor affecting cotton growth, is commonly used to characterize cotton growth condition, predict yield, and diagnose cotton nutrition status [2,3]. Therefore, developing a rapid and accurate estimation approach for cotton N content is of great significance for smart nutrition management in precision agriculture and also contributes to environmental protection and farmers’ economic benefits.
The traditional method to evaluate crop N status is chemical analysis, which is a destructive and time-consuming approach. Handheld sensors (e.g., SPAD (Japan), GreenSeeker (USA), and LI-COR (USA)) can indirectly evaluate crop N status at a small scale, but they cannot comprehensively represent the crop canopy across a field or at a large scale [4,5,6]. Remote sensing technology based on unmanned aerial vehicles (UAVs) has been widely used for image collection to evaluate crop growth conditions [7,8,9,10,11,12]. Most studies used RGB, near-infrared, multispectral, hyperspectral, or thermal infrared cameras and sensors mounted on UAVs for crop monitoring at a large scale [8,9,10,11,12,13,14]. Due to its affordability, flexibility, and high efficiency, as well as its band alignment, the RGB camera is more widely used in crop monitoring than multispectral and hyperspectral sensors, which indicates that UAV-based RGB cameras have great potential for crop monitoring [15,16].
Leaf N content (LNC) or plant N content (PNC) is related to the capability of photosynthesis, which is highly affected by N fertilizer applications and crop yield; thus, many studies have been conducted to monitor the N nutrition status in leaves or plants of different crops by using UAV [2,13,17,18,19]. Jiang et al. [17] have shown that UAV-based RGB cameras have the capability to derive optimum vegetation indices (VIs) for LNC monitoring in winter wheat, and the study constructed a new VI (true color VI) from RGB images that can better mitigate saturation than other VIs. Lebourgeois et al. [16] used RGB and NIR-G-B sensors mounted on UAVs to detect N status and found the best correlation of LNC in sugarcane with a broadband version of the simple ratio pigment index (SRPIb) (R2 = 0.7) among all indices examined. Further, Schirrmann et al. [20] found the red and green ratio from UAV RGB imagery correlated well (R2 = 0.68) with PNC at the heading stage of winter wheat.
The aforementioned studies proved that UAV RGB imagery is a good tool for detecting crop N nutrition status, but whether the cotton N content of leaves or plants at different growth stages can be detected by UAV-based RGB sensors remains unclear. Most studies on the estimation of cotton N status used hyperspectral or multispectral sensors, which contain a large amount of band information and data for analysis, and extracting useful data from such complex data to construct a prediction model remains a tough technical challenge [21,22,23,24]. Moreover, studies on the estimation of cotton N status using an RGB camera sensor were mostly carried out under different N treatments without considering the other fertilizer elements [25,26]. The soil environment is not composed of a single major element: it contains the three major elements N, phosphorus, and potassium, and different fertigation managements affect cotton N absorption and utilization, which in turn affect plant N content. However, cotton LNC and PNC estimation using digital cameras under different nutrition conditions is poorly understood, and few studies have combined digital camera-extracted data with machine learning technology to estimate the LNC or PNC of cotton at different growth stages.
Therefore, the objective of this work is to estimate the N content of drip-irrigated cotton in leaves and at plant level under combined N, phosphorus, and potassium treatments. To achieve this objective, thirty VIs extracted from visible digital images of low-cost UAVs, combined with three machine learning techniques and stepwise multiple linear regression (SMLR), were used to evaluate their performance in estimating N content in leaves and plants. The results of this study will provide a reference for precision agro-ecological applications and management of drip irrigation cotton fields in Xinjiang.

2. Materials and Methods

2.1. Experimental Design

Field experiments were carried out at the experiment farm of Shihezi University, Shihezi, Xinjiang, China (44°18′ N, 86°02′ E) in 2018, 2019, and 2020. A local cotton cultivar, Lumianyan 24, currently used in local production, was grown in the cotton field. The soil at the experimental site was alkaline (pH 8.15) and, averaged over 2018–2020, contained N 145.47 mg kg−1, Olsen P 36.18 mg kg−1, K2O 67.30 mg kg−1, and organic matter 31.50 g kg−1.
The experiments used a randomized complete block design with three replicates. Four N application rates (506, 402.5, 299, and 195.5 kg ha−1, designated as N1, N2, N3, and N4, respectively) and four P and K application splits (PK-Ms) between the squaring and bloom-bolling stages were tested: PK-M1 (100%:0%), PK-M2 (25%:75%), PK-M3 (50%:50%), and PK-M4 (75%:25%) (Table 1). Phosphorus and potassium fertilizers were applied to each plot at 108 kg ha−1 P2O5 and 97.2 kg ha−1 K2O, respectively. There were 48 plots in the experiments, and the size of each plot was 15 m × 2.25 m.
A plot seeder was used to sow seeds at a density of 11 plants per square meter in all three seasons. The sowing dates were 21 April 2018, 24 April 2019, and 23 April 2020, and topping was conducted on 15 July 2018, 7 July 2019, and 10 July 2020, respectively. All fertilizer was applied with irrigation water every 7–10 days during the growth period, eight times in total.

2.2. Data Collection

In this study, 11 ground control points were marked on the ground in advance, and the coordinates of each control point were measured with GPS. The RGB camera of the DJI Mavic Pro (Table 2 and Figure 1) was used to collect data at each key growth stage of cotton. Continuous monitoring of the study area was carried out under stable light intensity, clear and cloudless weather, and wind speeds below level 4. Each flight was conducted from 12:00 to 14:00 local time. DJI GS PRO software was used to control the UAV; each flight followed a pre-defined plan with approximately 80% forward overlap and 60% side overlap and acquired about 350 images (spatial resolution of 1.3 cm). Images were acquired at 1 frame per 5 s and saved in JPEG format, and the camera aperture was set to f/5. The UAV was set to automatic flying mode, flying at an altitude of 10 m and a speed of 2 m per second, so each flight took about 12 min. The same flight routes and camera settings were used in each growing season. Following the data processing workflow of Lu et al. (2019) [27], the acquired UAV digital images were processed for image alignment, georeferencing, mosaicking, dense point cloud generation, orthorectified image generation, and ground control point correction using Agisoft PhotoScan Professional (Table 3) and ENVI. Firstly, the overlapping images were aligned automatically using a feature point matching algorithm, all ground control points were used to georeference each image, and the internal camera parameters were estimated from the image alignment and the ground control points. Secondly, the estimated parameters were used to compensate for linear model misalignment during georeferencing, and median filtering was chosen to build the dense point cloud. Lastly, the orthophotos were generated.
Before the R, G, and B values were extracted from each plot at each growth stage, image segmentation was performed with the Super-Green method (2g − r − b) to obtain the Super-Green images, which were then converted to grayscale and enhanced by median filtering (Figure 2). The gray-level image was transformed into a binary image by fixed-threshold segmentation (threshold value 0.40). A color image with a black background was then obtained by color filling, and the R, G, and B values of the three bands were extracted separately for each plot using MATLAB R2018a (MathWorks Inc., Natick, MA, USA). The extracted R, G, and B values were averaged per plot for further data analysis.
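The segmentation and plot-mean extraction described above can be sketched as follows (a minimal NumPy sketch; the function names and synthetic layout are our own, and the paper's median-filtering step is noted in a comment rather than implemented):

```python
import numpy as np

def super_green_mask(rgb, threshold=0.40):
    """Segment vegetation with the Super-Green (Excess Green) index 2g - r - b.

    rgb: float array of shape (H, W, 3) with values scaled to [0, 1].
    threshold: fixed binarisation value (0.40 in the paper).
    """
    r, g, b = rgb[..., 0], rgb[..., 1], rgb[..., 2]
    exg = 2 * g - r - b
    # The paper additionally median-filters the gray-level image here
    # before thresholding; omitted for brevity.
    return exg > threshold  # binary vegetation mask

def plot_mean_rgb(rgb, mask):
    """Mean R, G, B values over the vegetation pixels of one plot."""
    return tuple(float(rgb[..., c][mask].mean()) for c in range(3))
```

In practice the mask would be applied to each plot's clipped orthophoto, and the per-plot means fed into the VI calculations.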
Plants from the 48 plots were randomly sampled within a day before the UAV campaigns at each growth stage (Table 4). Three representative cotton plants in each plot were selected as one sample, put into paper bags, and brought back to the laboratory for subsequent treatment. Each cotton plant was divided into leaves, stems, and reproductive organs and oven-dried at 105 °C for 30 min and afterwards at 80 °C until constant weight, after which the dry matter weight of each organ was recorded.
The oven-dried samples of the different organs were used to determine N content (Kjeldahl method) [28]. Samples of 0.1 g of dried and ground material were digested using a mixture of H2O2 and H2SO4 until the solution became clear and transparent, and the N content was determined using a continuous-flow auto-analyzer (BRAN + LUEBBE AA3; Germany). The formula for calculating plant N content is as follows:
Plant N content (PNC, %) = (Leaf N content (%) × Leaf dry matter weight + Stem N content (%) × Stem dry matter weight + Reproductive organ N content (%) × Reproductive organ dry matter weight)/Aboveground dry matter weight.
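The formula above is a dry-matter-weighted average of the organ N contents, which can be expressed as a short helper (a sketch; the function name is our own):

```python
def plant_n_content(organs):
    """Dry-matter-weighted mean N content across organs.

    organs: iterable of (n_content_percent, dry_matter_weight) pairs,
            one per organ (leaves, stems, reproductive organs).
    """
    total_n = sum(nc * w for nc, w in organs)      # numerator of the formula
    total_dm = sum(w for _, w in organs)           # aboveground dry matter weight
    return total_n / total_dm
```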

2.3. Vegetation Indices

Based on the preprocessed UAV digital images, the plot-mean RGB values of the 48 plots at each growth stage were extracted with MATLAB. Based on previous research on the relationship between N and visible-light-band VIs, 30 VIs were selected to estimate the LNC and PNC of cotton. The visible-light VIs selected in this study are listed in Table 5.
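A few of the visible-band VIs can be computed from the plot-mean R, G, and B values as below. This is an illustrative sketch only: the index definitions shown here are the forms commonly used in the literature (the GBRI and INT forms in particular are our assumptions), and Table 5 of the paper gives the authoritative set of thirty.

```python
def visible_vis(R, G, B):
    """Example visible-band VIs from plot-mean digital numbers R, G, B."""
    s = R + G + B
    r, g, b = R / s, G / s, B / s         # normalised chromatic coordinates
    return {
        "ExG": 2 * g - r - b,             # Excess Green
        "ExB": 1.4 * b - g,               # Excess Blue (common definition)
        "GBRI": G / B,                    # green/blue ratio (assumed definition)
        "g-b": g - b,
        "r-b": r - b,
        "INT": s / 3,                     # intensity (assumed definition)
    }
```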

2.4. Regression Techniques

In this study, three machine learning algorithms—Random Forest (RF), Support Vector Machine (SVM), and Back Propagation Neural Network (BP)—together with stepwise multiple linear regression (SMLR) were implemented in Data Processing System software.
The two key RF parameters, mtry and ntree, were set to 10 and 1000, respectively. The SVM used the radial basis function (RBF) kernel to capture nonlinear relationships in the data, with the cost value set to 1. The BP neural network adopted a three-layer 30-21-1 structure (input, hidden, and output layers), with 1000 iterations and a training rate of 0.1.
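For readers working in Python, the reported settings map roughly onto scikit-learn estimators as follows. This is a hypothetical configuration sketch: the paper used Data Processing System software, so parameter names and defaults differ, and the mapping of "training rate" to `learning_rate_init` is our assumption.

```python
from sklearn.ensemble import RandomForestRegressor
from sklearn.svm import SVR
from sklearn.neural_network import MLPRegressor

rf = RandomForestRegressor(n_estimators=1000,   # ntree = 1000
                           max_features=10)     # mtry = 10
svm = SVR(kernel="rbf", C=1.0)                  # RBF kernel, cost = 1
bp = MLPRegressor(hidden_layer_sizes=(21,),     # 30-21-1 topology (30 inputs, 1 output)
                  max_iter=1000,                # 1000 iterations
                  learning_rate_init=0.1)       # training rate 0.1 (assumed mapping)
```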

2.5. Accuracy Assessment

From the samples collected in each growing season, we pooled the data from 3 years and all growing conditions and stages to form a dataset. We used the hold-out method to split the dataset into two parts, with 2/3 for model calibration and 1/3 for model validation.
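The 2/3–1/3 hold-out split can be sketched as below (a minimal NumPy version; the function name and fixed seed are our own choices, not the paper's):

```python
import numpy as np

def holdout_split(X, y, calib_frac=2 / 3, seed=0):
    """Random hold-out split: 2/3 for calibration, 1/3 for validation."""
    rng = np.random.default_rng(seed)
    idx = rng.permutation(len(y))              # shuffle sample indices
    n_cal = int(round(len(y) * calib_frac))    # size of the calibration set
    cal, val = idx[:n_cal], idx[n_cal:]
    return X[cal], y[cal], X[val], y[val]
```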
In this study, the performance of each regression model was compared using three indexes: the coefficient of determination (R2), root mean square error (RMSE), and relative root mean square error (rRMSE). R2 represents the goodness of fit between predicted and measured values; the closer it is to 1, the higher the fitting accuracy of the model. RMSE reflects the dispersion between predicted and measured values, so the smaller its value, the better the fitting accuracy of the estimation model. Overall, the larger the R2 of a model and its validation, and the smaller the RMSE and rRMSE, the better the estimation ability of the model. The calculation formulas are as follows:
$$R^2 = \frac{\left[\sum_{i=1}^{n}(X_i - \bar{X})(Y_i - \bar{Y})\right]^2}{\sum_{i=1}^{n}(X_i - \bar{X})^2 \sum_{i=1}^{n}(Y_i - \bar{Y})^2}$$
$$\mathrm{RMSE} = \sqrt{\frac{\sum_{i=1}^{n}(Y_i - X_i)^2}{n}}$$
$$\mathrm{rRMSE}\,(\%) = \frac{\mathrm{RMSE}}{\bar{X}} \times 100$$
where $X_i$, $\bar{X}$, $Y_i$, and $\bar{Y}$ represent the measured value, the mean of the measured values, the predicted value, and the mean of the predicted values, respectively, and $n$ represents the number of samples used to develop the model.
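These three metrics can be computed directly from paired measured and predicted values (a sketch; R2 is taken as the squared Pearson correlation, consistent with the formula above):

```python
import numpy as np

def r2_rmse_rrmse(measured, predicted):
    """Accuracy metrics: R2 (squared correlation), RMSE, and rRMSE (%)."""
    x = np.asarray(measured, dtype=float)
    y = np.asarray(predicted, dtype=float)
    r2 = np.corrcoef(x, y)[0, 1] ** 2             # squared Pearson correlation
    rmse = np.sqrt(np.mean((y - x) ** 2))         # root mean square error
    rrmse = rmse / x.mean() * 100                 # RMSE as % of the measured mean
    return r2, rmse, rrmse
```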

3. Results

3.1. Variability in the Measured N Content and RGB

Figure 3 shows the overall trend of LNC and PNC from the five growth-stage data sets during 2018–2020. Both LNC and PNC decreased along with the growth stages, peaking at the peak squaring stage. The LNC varied from 16.80 g kg−1 to 59.11 g kg−1 across growth stages; average LNC decreased from 49.75 g kg−1 at peak squaring to 27.20 g kg−1 at the boll-opening stage. The PNC varied from 6.08 g kg−1 to 49.89 g kg−1 across growth stages; average PNC decreased from 40.21 g kg−1 at peak squaring to 17.48 g kg−1 at the boll-opening stage. The coefficients of variation of LNC and PNC increased from the peak squaring stage to the boll-opening stage (Table 6). Based on the analysis of variance, year, N treatment, and PK-Ms significantly affected LNC and PNC, and there was an interaction between N and PK-Ms on LNC and PNC during 2018–2020 (Table 7).
Figure 4 and Table 8 show the overall trend of R, G, and B status from the five growth stage data sets during 2018–2020. The trend of R value increased first and then decreased along with growth stages, except in 2018. The R value varied from 73.53 to 143.29 across growth stages. The average R value increased from 94.74 at peak squaring to 125.71 at boll-opening stages. The trend of G value increased first and then decreased at the boll-opening stage, and the G value varied from 72.79 to 183.46 across growth stages. The lowest average G value was 120.81 at the peak flowering stage, and the highest G value was 141.88 at the late peak boll-forming stage. The trend of the B value increased along with growth stages, and the B value varied from 65.70 to 140.96 across growth stages. The lowest average B value was 86.50 at the peak squaring stage, and the highest average B value was 124.97 at the boll-opening stages.

3.2. Correlation between UAV-Derived Variables

Based on the UAV RGB band information, thirty VIs were selected, referring to previous research on the relationship between N and visible-light-band VIs (Figure 5). For LNC, twenty-three of the thirty VIs showed positive or negative correlations. GBRI, g-b, bn, and ExB were the most strongly correlated with LNC, while INT and r-b were the most weakly correlated (GBRI and g-b: r = 0.80, p-value < 0.01; bn and ExB: r = −0.80, p-value < 0.01; INT: r = 0.05, p-value > 0.05; r-b: r = −0.05, p-value > 0.05). For PNC, twenty-eight of the thirty VIs showed positive or negative correlations; the highest correlation with PNC was found for bn (r = 0.79, p-value < 0.01) and the lowest for r-b (r = 0.01, p-value > 0.05). Generally, the correlations of VIs with PNC were weaker than those with LNC. Among the twenty-three VIs for LNC and twenty-eight VIs for PNC, those whose correlation coefficient with LNC (PNC) exceeded 0.50 were used as input parameters for model development.
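The |r| > 0.50 screening used to choose model inputs can be sketched as a simple filter (function and variable names are our own):

```python
import numpy as np

def select_vis(vi_matrix, n_content, names, r_min=0.50):
    """Keep the VIs whose absolute Pearson r with N content exceeds r_min.

    vi_matrix: array of shape (n_samples, n_vis), one column per VI.
    n_content: array of shape (n_samples,), measured LNC or PNC.
    names:     list of VI names, one per column.
    """
    keep = []
    for j, name in enumerate(names):
        r = np.corrcoef(vi_matrix[:, j], n_content)[0, 1]
        if abs(r) > r_min:
            keep.append(name)
    return keep
```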

3.3. Comparison of N Content Estimation Performance with Machine Learning Techniques and SMLR

Table 9 lists the three machine learning techniques and SMLR for the estimation of LNC and PNC over the critical growth stages. For LNC, the peak-squaring growth-stage-specific models yielded the best performance, with R2 ≥ 0.7566, RMSE ≤ 1.9088, and rRMSE ≤ 3.82% across the three machine learning techniques and SMLR. For PNC, the peak-squaring growth-stage-specific models likewise yielded the best performance, with R2 ≥ 0.7468, RMSE ≤ 1.7663, and rRMSE ≤ 4.38%. For both LNC and PNC, the growth-stage-specific models at the boll-opening stage showed the worst performance, with R2 ≤ 0.5191, RMSE ≥ 3.2787, and rRMSE ≥ 11.92%. Of the three machine learning techniques, RF yielded higher R2 and lower RMSE and rRMSE values than BP, SVM, and SMLR.
The estimation model of cotton N content should not only have high accuracy but also have repeatability and universality. Therefore, the validation set data were used to verify the model constructed above; the predicted value of N content and the measured value in the field were linearly fitted, and the reliability of the selected model was shown in a 1:1 scatter plot (Figure 6). Consistently, according to R2, the model performances are ordered as follows: Peak squaring > Peak flowering > Early peak boll-forming > Late peak boll-forming > Boll-opening. Peak-squaring growth-stage-specific models yield the best performance for LNC and PNC in validation. RF yielded the highest accuracy for LNC (R2 = 0.8782, RMSE = 1.3877, rRMSE = 2.82%) and PNC (R2 = 0.8622, RMSE = 1.3188, rRMSE = 3.30%). The BP has the lowest estimation accuracy among the three models. The growth-stage-specific models at boll-opening stages yielded the worst performance for LNC and PNC, with R2 below 0.5362, RMSE above 3.2019, and rRMSE above 12.04%.

4. Discussion

The goal of this study was to predict cotton N content at leaf and plant level by using UAV imagery under different fertigation management and compare the performance of the predicted model on LNC and PNC at different growth stages. LNC and PNC, being indicators for N nutrition status, were mostly used in remote sensing technology for crop N monitoring and prediction [2,7,10,15,17,18]. As crop N status affects the chlorophyll content in plants, resulting in changes in visible light absorption and reflection, crop N nutrition diagnosis can be realized by using visible light digital image information [21,22,26,42,43]. Previous studies have shown that UAV-mounted RGB sensors have the best potential for crop N content prediction, and N content is sensitive to visible bands in the whole cotton growth stage [16,17,21]. In this study, based on the results of correlation analysis, the selected VIs have a strong correlation with cotton LNC and PNC, most of them at a significant level, which indicates that there is great potential in using digital images to estimate cotton N nutrition [16,17,20,21,22].
In this study, UAV digital images were used to construct estimation models of N content in the leaves and plants of cotton at each key growth stage. It was found that the accuracy of the models constructed from digital images decreased continuously as the growth stages progressed. The validation results showed the same trend, which is similar to the results of Cao et al. [44] and Bendig et al. [40], who found that the accuracy of VIs in estimating N status and biomass in early growth stages is higher than in later growth stages. The higher accuracy of N content estimation in early growth stages might be due to lower chlorophyll content, since the high chlorophyll content of crops at later growth stages easily causes saturation in visible light absorption. Further, the estimation accuracy of models based on visible-light VIs was better for LNC than for PNC at the different growth stages. This results from the vertical observation mode adopted in this study, in which the contribution of canopy leaves to the spectral information of the visible band is absolutely dominant; another reason might be that the assembly and distribution of N in cotton is accompanied by changes in plant structure, and the variability of the PNC data set is always higher than that of LNC during the reproductive growth stage (Table 5). In addition, the accuracy of the models developed from VIs in the boll-opening stage is much lower than in the other four stages, because of the senescence and shedding of leaves in the boll-opening stage, the canopy image being affected by soil spectral reflection, and the more complex mixed information contained in the image pixels. However, in cotton production, N fertilizer is usually no longer applied in the boll-opening stage, so the low accuracy of cotton LNC and PNC models in this period has no production significance; this is similar to the results of Wang et al. [21], Ma et al. [24], and Kou et al. [26], who found lower accuracy for N content in the boll-opening stage and better prediction in earlier growth stages. The accuracy of the all-stage model is generally better than that of the late peak boll-forming stage model, so the all-stage model can be used instead to obtain higher accuracy, which fills the gap in N content estimation for the later growth stages of cotton compared to previous research [21,22,23,24,25].
In this study, RGB values and their combined VIs were used as input variables to estimate LNC and PNC and to evaluate the performance of three machine learning techniques. The results showed that the performance of RF was better than that of SVM and the BP neural network, which is similar to previous research [45,46,47,48]. RF is a popular nonlinear ensemble statistical method that has good tolerance to noise and avoids overfitting [46,47], so it has higher fitting accuracy than SMLR. SVM, which solves nonlinear problems through kernel functions, is widely used in the field of remote sensing and is suitable for datasets with small sample sizes and high feature dimensions [48]. As one of the typical artificial neural network models, the BP neural network is composed of an input layer, a hidden layer, and an output layer, and can solve complicated nonlinear mapping problems [27,49]. In this study, all VIs were calculated from RGB values, so there were strong correlations among them. The RF algorithm is insensitive to the collinearity of variables [50], making it better suited to handling multiple interrelated variables without testing the normality and independence of variables, and thus able to achieve higher model accuracy. The reason the accuracy of BP was lower than that of the other models may be that its convergence speed is slow or its learning rate is unstable during the learning process.
Some studies focused on developing models for the estimation of crop N content under different N treatments, or under combined N treatments and plant densities, without considering the other two major elements in the field during the crop growth stages [21,22,23,24,25,26,51]. However, crop N content is affected by field fertigation management under drip-irrigated conditions, and there are interaction effects among N, phosphorus, and potassium in the soil, which affect crop canopy N content, canopy reflectance, and model accuracy. In this study, LNC and PNC were predicted from UAV images under combined N, phosphorus, and potassium treatments, and the results were satisfactory. However, this study obtained fixed-resolution digital images at a flight altitude of 10 m, which was relatively lower than in other studies [19,25,26,27]. Flight altitude is one of the most important parameters affecting observation accuracy: the higher the flight altitude, the lower the spatial resolution obtained. Zhu et al. (2023) and He et al. (2022) [52,53] found that higher flight altitudes led to larger sampling areas in each pixel with fewer sampling quantities, indicating a higher possibility of decreasing the frequency of feature values and affecting the accuracy of value extraction due to increased background interference. Further, flight altitudes of 10 m or less can decrease the effects of relative humidity, air mass density, and atmospheric transmittance on data acquisition, but increase the flight time and cost of the UAV [52,53,54]. Our previous studies have shown that setting the flight altitude at 10 m with 1.3 cm ground resolution is a useful and efficient approach for image acquisition and cotton detection [55,56,57], while whether different flight altitudes would affect the accuracy of the cotton growth model still needs further discussion.

5. Conclusions

This study presented a cotton N estimation model at leaf and plant level based on UAV images under different fertigation managements and compared the estimation accuracy of the N model between leaf and plant with three machine learning algorithms and stepwise multiple linear regression at different growth stages. The following conclusions were drawn:
(1) Cotton N content at leaf or plant level at different growth stages could be detected by UAV-based RGB sensors; the estimation accuracy for leaves was better than that for plants with the random forest algorithm at each growth stage, and the highest accuracy was found at the peak squaring stage.
(2) The LNC estimation model using the random forest algorithm for the whole growth period could be used in later growth stages due to its higher accuracy.
(3) The results of this study showed that there is potential for using color indices derived from UAV-based digital systems to produce predicted LNC maps. In future work, texture features, as well as combinations of texture and color features, could be taken into consideration to improve the accuracy of the LNC estimation model and cover more growth stages.

Author Contributions

Conceptualization, Y.L. (Yang Liu) and F.M.; methodology, Y.L. (Yang Liu) and M.W.; software, Y.L. (Yang Lu); validation, Y.L. (Yang Liu), Y.C. and M.W.; formal analysis, Y.L. (Yang Liu); investigation, Y.C. and M.W.; resources, Y.L. (Yang Liu); data curation, Y.L. (Yang Liu) and M.W.; writing—original draft preparation, Y.L. (Yang Liu); writing—review and editing, Y.L. (Yang Liu); visualization, M.W.; supervision, Y.L. (Yang Liu) and F.M.; project administration, Y.L. (Yang Liu); funding acquisition, Y.L. (Yang Liu). All authors have read and agreed to the published version of the manuscript.

Funding

This research was funded by the National Natural Science Foundation of China (grant number 31860346), the Scientific and Technology Program of Xinjiang Production and Construction Corps (grant number 2020AB017), and the Shihezi University Scientific Research Cultivation Project for Young Scholars (grant number CXBJ202001). We also thank Precision Agriculture Company for the N concentration measurements.

Conflicts of Interest

The authors declare no conflict of interest.

References

  1. Hou, X.H.; Xiang, Y.Z.; Fan, J.L.; Zhang, F.C.; Hu, W.H.; Yan, F.L.; Guo, J.J.; Xiao, C.; Li, Y.P.; Cheng, H.L.; et al. Evaluation of cotton N nutrition status based on critical N dilution curve, N uptake and residual under different drip fertigation regimes in Southern Xinjiang of China. Agric. Water Manag. 2021, 256, 107134.
  2. Tarpley, L.; Reddy, K.R.; Sassenrath-Cole, G.F. Reflectance indices with precision and accuracy in predicting cotton leaf nitrogen concentration. Crop Sci. 2000, 40, 1814–1819.
  3. Chen, Y.; Wen, M.; Lu, Y.; Li, M.H.; Lu, X.; Yuan, J.; Liu, Y.; Ma, F.Y. Establishment of a critical nitrogen dilution curve for drip-irrigated cotton under reduced nitrogen application rates. J. Plant Nutr. 2022, 12, 1786–1798.
  4. Yu, H.; Wu, H.S.; Wang, Z.J. Evaluation of SPAD and Dualex for in-season corn nitrogen status estimation. Acta Agron. Sin. 2010, 36, 840–847.
  5. Ali, A.M.; Thind, H.S.; Varinderpal, S.; Bijay, S. A framework for refining nitrogen management in dry direct-seeded rice using GreenSeeker™ optical sensor. Comput. Electron. Agric. 2015, 110, 114–120.
  6. Yao, H.; Zhang, Y.; Yi, X.; Zuo, W.; Lei, Z.; Sui, L.; Zhang, W. Characters in light-response curves of canopy photosynthetic use efficiency of light and N in responses to plant density in field-grown cotton. Field Crops Res. 2017, 203, 192–200.
  7. Cao, Q.; Miao, Y.; Wang, H.; Huang, S.; Cheng, S.; Khosla, R.; Jiang, R. Non-destructive estimation of rice plant nitrogen status with Crop Circle multispectral active canopy sensor. Field Crops Res. 2013, 154, 133–144.
  8. Feng, W.; Zhang, H.Y.; Zhang, Y.S.; Qi, S.L.; Heng, Y.R.; Guo, B.B.; Guo, T.C. Remote detection of canopy leaf nitrogen concentration in winter wheat by using water resistance vegetation indices from in-situ hyperspectral data. Field Crops Res. 2016, 198, 238–246.
  9. Greaves, H.E.; Vierling, L.A.; Eitel, J.U.; Boelman, N.T.; Magney, T.S.; Prager, C.M.; Griffin, K.L. High-resolution mapping of aboveground shrub biomass in Arctic tundra using airborne lidar and imagery. Remote Sens. Environ. 2016, 184, 361–373.
  10. Li, M.; Shamshiri, R.; Weltzien, C.; Schirrmann, M. Crop monitoring using Sentinel-2 and UAV multispectral imagery: A comparison case study in Northeastern Germany. Remote Sens. 2022, 14, 4426.
  11. Fu, Y.; Yang, G.; Song, X.; Li, Z.; Xu, X.; Feng, H.; Zhao, C. Improved estimation of winter wheat aboveground biomass using multiscale textures extracted from UAV-based digital images and hyperspectral feature analysis. Remote Sens. 2021, 13, 581.
  12. Du, L.; Huan, Y.; Song, X.; Wei, N.; Yu, C.; Wang, W.; Zhao, Y. Estimating leaf area index of maize using UAV-based digital imagery and machine learning methods. Sci. Rep. 2022, 12, 15937.
  13. Zheng, H.; Cheng, T.; Yao, X.; Deng, X.; Tian, Y.; Cao, W.; Zhu, Y. Detection of rice phenology through time series analysis of ground-based spectral index data. Field Crops Res. 2016, 198, 131–139.
  14. Wu, S.; Yang, P.; Ren, J.; Chen, Z.; Liu, C.; Li, H. Winter wheat LAI inversion considering morphological characteristics at different growth stages coupled with microwave scattering model and canopy simulation model. Remote Sens. Environ. 2020, 240, 111681.
  15. Hunt, E.R.; Cavigelli, M.; Daughtry, C.S.T.; McMurtrey, J.E.; Walthall, C.L. Evaluation of digital photography from model aircraft for remote sensing of crop biomass and nitrogen status. Precis. Agric. 2005, 6, 359–378.
  16. Lebourgeois, V.; Bégué, A.; Labbé, S.; Houlès, M.; Martiné, J.F. A light-weight multi-spectral aerial imaging system for nitrogen crop monitoring. Precis. Agric. 2012, 13, 525–541.
  17. Jiang, J.L.; Cai, W.D.; Zheng, H.B.; Cheng, T.; Tian, Y.C.; Zhu, Y.; Ehsani, R.; Hu, Y.Q.; Niu, Q.S.; Gui, L.J.; et al. Using digital cameras on an unmanned aerial vehicle to derive optimum color vegetation indices for leaf nitrogen concentration monitoring in winter wheat. Remote Sens. 2019, 11, 2667.
  18. Hunt, E.; Horneck, D.; Spinelli, C.; Turner, R.; Bruce, A.; Gadler, D.; Brungardt, J.; Hamm, P. Monitoring nitrogen status of potatoes using small unmanned aerial vehicles. Precis. Agric. 2018, 19, 314–333.
  19. Li, S.; Ding, X.; Kuang, Q.; Ata-Ul-Karim, S.; Cheng, T.; Liu, X.; Tian, Y.; Zhu, Y.; Cao, W.; Cao, Q. Potential of UAV-based active sensing for monitoring rice leaf nitrogen status. Front. Plant Sci. 2018, 9, 1834.
  20. Schirrmann, M.; Giebel, A.; Gleiniger, F.; Pflanz, M.; Lentschke, J.; Dammer, K.H. Monitoring agronomic parameters of winter wheat crops with low-cost UAV imagery. Remote Sens. 2016, 8, 706.
  21. Wang, K.; Pan, W.; Li, S.; Cheng, B.; Xiao, H.; Wang, F.; Chen, J. Monitoring models of the plant nitrogen content based on cotton canopy hyperspectral reflectance. Spectrosc. Spectr. Anal. 2011, 31, 1868–1872.
  22. Qin, S.; Ding, Y.; Zhou, Z.; Zhou, M.; Wang, H.; Xu, F.; Yao, Q.; Lv, X.; Zhang, Z.; Zhang, L. Study on the nitrogen content estimation model of cotton leaves based on "image-spectrum-fluorescence" data fusion. Front. Plant Sci. 2023, 14, 1117277.
  23. Li, L.; Li, F.; Liu, A.; Wang, X. The prediction model of nitrogen nutrition in cotton canopy leaves based on hyperspectral visible-near infrared band feature fusion. Biotechnol. J. 2023, 2200623.
  24. Ma, L.; Chen, X.; Zhang, Q.; Lin, J.; Yin, C.; Ma, Y.; Yao, Q.; Feng, L.; Zhang, Z.; Lv, X. Estimation of nitrogen content based on the hyperspectral vegetation indexes of interannual and multi-temporal in cotton. Agronomy 2022, 12, 1319.
  25. Jamil, N.; Kootstra, G.; Kooistra, L. Evaluation of individual plant growth estimation in an intercropping field with UAV imagery. Agriculture 2022, 12, 102.
  26. Kou, J.; Duan, L.; Yin, C.; Ma, L.; Chen, X.; Gao, P.; Lv, X. Predicting leaf nitrogen content in cotton with UAV RGB images. Sustainability 2022, 14, 9259.
  27. Lu, N.; Zhou, J.; Han, Z.; Li, D.; Cao, Q.; Yao, X.; Cheng, T. Improved estimation of aboveground biomass in wheat from RGB imagery and point cloud data acquired with a low-cost unmanned aerial vehicle system. Plant Methods 2019, 15, 17.
  28. Bremner, J.M.; Mulvaney, C.S. Nitrogen-total. In Methods of Soil Analysis, Part 2; Page, A.L., Miller, R.H., Keeney, D.R., Eds.; American Society of Agronomy: Madison, WI, USA, 1982; pp. 595–624.
  29. Woebbecke, D.M.; Meyer, G.E.; Bargen, K.V.; Mortensen, D.A. Color indices for weed identification under various soil, residue, and lighting conditions. Trans. ASAE 1995, 38, 259–269.
  30. Gamon, J.A.; Qiu, H. Ecological applications of remote sensing at multiple scales. Handb. Funct. Plant Ecol. 1999, 805–846.
  31. Tucker, C.J. Red and photographic infrared linear combinations for monitoring vegetation. Remote Sens. Environ. 1979, 8, 127–150.
  32. Mao, W.; Wang, Y.; Wang, Y. Real-time detection of between-row weeds using machine vision. In Proceedings of the 2003 ASAE Annual Meeting, Las Vegas, NV, USA, 27–30 July 2003.
  33. Kawashima, S.; Nakatani, M. An algorithm for estimating chlorophyll content in leaves using a video camera. Ann. Bot. 1998, 81, 49–54.
  34. Louhaichi, M.; Borman, M.M.; Johnson, D.E. Spatially located platform and aerial photography for documentation of grazing impacts on wheat. Geocarto Int. 2001, 16, 65–70.
  35. Gitelson, A.A.; Kaufman, Y.J.; Stark, R.; Rundquist, D. Novel algorithms for remote estimation of vegetation fraction. Remote Sens. Environ. 2002, 80, 76–87.
  36. Meyer, G.E.; Neto, J.C. Verification of color vegetation indices for automated crop imaging applications. Comput. Electron. Agric. 2008, 63, 282–293.
  37. Neto, J.C. A Combined Statistical-Soft Computing Approach for Classification and Mapping Weed Species in Minimum-Tillage Systems. Ph.D. Thesis, University of Nebraska, Lincoln, NE, USA, 2004. Available online: http://digitalcommons.unl.edu/dissertations/AAI3147135 (accessed on 3 May 2023).
  38. Hague, T.; Tillett, N.D.; Wheeler, H. Automated crop and weed monitoring in widely spaced cereals. Precis. Agric. 2006, 7, 21–32.
  39. Kataoka, T.; Kaneko, T.; Okamoto, H.; Hata, S. Crop growth estimation system using machine vision. In Proceedings of the 2003 IEEE/ASME International Conference on Advanced Intelligent Mechatronics, Kobe, Japan, 20–24 July 2003; pp. b1079–b1083.
  40. Bendig, J.; Yu, K.; Aasen, H.; Bolten, A.; Bennertz, S.; Broscheit, J.; Bareth, G. Combining UAV-based plant height from crop surface models, visible, and near infrared vegetation indices for biomass monitoring in barley. Int. J. Appl. Earth Obs. Geoinf. 2015, 39, 79–87.
  41. Gamon, J.A.; Surfus, J.S. Assessing leaf pigment content and activity with a reflectometer. New Phytol. 2010, 143, 105–117.
  42. Blackmer, T.M.; Schepers, J.S.; Varvel, G.E.; Walter-Shea, E.A. Nitrogen deficiency detection using reflected shortwave radiation from irrigated corn canopies. Agron. J. 1996, 88, 1–5.
  43. Zhao, D.; Reddy, K.R.; Kakani, V.G.; Reddy, V.R. Nitrogen deficiency effects on plant growth, leaf photosynthesis, and hyperspectral reflectance properties of sorghum. Eur. J. Agron. 2005, 22, 391–403.
  44. Cao, Q.; Miao, Y.; Feng, G.; Gao, X.; Li, F.; Liu, B.; Khosla, R. Active canopy sensing of winter wheat nitrogen status: An evaluation of two sensor systems. Comput. Electron. Agric. 2015, 112, 54–67.
  45. Wei, G.; Li, Y.; Zhang, Z.; Chen, Y.; Chen, J.; Yao, Z.; Chen, H. Estimation of soil salt content by combining UAV-borne multispectral sensor and machine learning algorithms. PeerJ 2020, 8, e9087.
  46. Grimm, R.; Behrens, T.; Märker, M.; Elsenbeer, H. Soil organic carbon concentrations and stocks on Barro Colorado Island—Digital soil mapping using Random Forests analysis. Geoderma 2008, 146, 102–113.
  47. Yue, J.; Yang, G.; Feng, H. Comparative of remote sensing estimation models of winter wheat biomass based on random forest algorithm. Trans. Chin. Soc. Agric. Eng. 2016, 32, 175–182.
  48. Pal, M. Kernel methods in remote sensing: A review. ISH J. Hydraul. Eng. 2009, 15, 194–215.
  49. Xiu, L.; Zhang, H.; Guo, Q.; Wang, Z.; Liu, X. Estimating nitrogen content of corn based on wavelet energy coefficient and BP neural network. In Proceedings of the 2015 2nd International Conference on Information Science and Control Engineering, Shanghai, China, 24–26 April 2015; pp. 212–216.
  50. Cutler, D.R.; Edwards, T.C., Jr.; Beard, K.H.; Cutler, A.; Hess, K.T.; Gibson, J.; Lawler, J.J. Random forests for classification in ecology. Ecology 2007, 88, 2783–2792.
  51. Zheng, H.; Cheng, T.; Zhou, M.; Li, D.; Yao, X.; Tian, Y.; Zhu, Y. Improved estimation of rice aboveground biomass combining textural and spectral analysis of UAV imagery. Precis. Agric. 2019, 20, 611–629.
  52. Zhu, C.; Zhao, W.; Shan, Z.; Li, J. Determination of UAV altitude and take-off time in the design of a variable rate irrigation prescription map. Trans. Chin. Soc. Agric. Eng. 2023, 5, 61–69. (In Chinese)
  53. He, Y.; Du, X.; Zheng, L.; Zhu, J.; Cen, H.; Xu, L. Effects of UAV flight height on estimated fractional vegetation cover and vegetation index. Trans. Chin. Soc. Agric. Eng. 2022, 24, 63–72. (In Chinese)
  54. Jin, R.; Zhao, L.; Zheng, L.; Zhong, X.; Ren, P. Influence of observation height of UAVs low altitude thermal infrared remote sensing on land surface temperature retrieval. Building Sci. 2022, 2, 89–98. (In Chinese)
  55. Wang, L.; Liu, Y.; Wen, M.; Li, M.; Dong, Z.; He, Z.; Cui, J.; Ma, F. Prediction of cotton yield reduction after hail damage using a UAV-based digital camera. Agron. J. 2021, 6, 5235–5245.
  56. Wang, L.; Wen, M.; Li, P.; Li, M.; Dong, Z.; Cui, J.; Liu, Y.; Ma, F. Growth and yield responses of drip-irrigated cotton to two different methods of simulated hail damages. Arch. Agron. Soil Sci. 2021, 9, 1272–1284.
  57. Jiang, J.; Li, R.; Ma, X.; Li, M.; Liu, Y.; Lu, Y.; Ma, F. Estimation of the quantity of drip-irrigated cotton seedling based on color and morphological features of UAV captured RGB images. Cotton Sci. 2022, 6, 508–522. (In Chinese)
Figure 1. The UAV used in this study: (a) the DJI Mavic Pro UAV equipped with (b) the FC220 RGB camera.
Figure 2. Cotton canopy image segmentation. Original image (a), image processed by the super-green method (b), binary image after threshold segmentation (c), color image after segmentation (d).
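The segmentation pipeline of Figure 2 (super-green transform of the canopy image, then threshold segmentation to separate vegetation from soil background) can be sketched as below. This is an illustrative reconstruction, not the authors' implementation: the use of NumPy and of Otsu's method to choose the threshold are assumptions.

```python
import numpy as np

def excess_green(img):
    """Super-green (ExG = 2g - r - b) index from an RGB image.

    img: H x W x 3 uint8 array. Channels are normalized by their sum so
    the index is insensitive to overall brightness.
    """
    rgb = img.astype(np.float64)
    total = rgb.sum(axis=2)
    total[total == 0] = 1.0  # avoid division by zero on black pixels
    r, g, b = (rgb[..., i] / total for i in range(3))
    return 2.0 * g - r - b

def otsu_threshold(values, bins=256):
    """Otsu's method: pick the threshold maximizing between-class variance."""
    hist, edges = np.histogram(values.ravel(), bins=bins)
    prob = hist.astype(np.float64) / hist.sum()
    centers = (edges[:-1] + edges[1:]) / 2.0
    w0 = np.cumsum(prob)            # cumulative class weight
    mu = np.cumsum(prob * centers)  # cumulative class mean
    with np.errstate(divide="ignore", invalid="ignore"):
        sigma_b = (mu[-1] * w0 - mu) ** 2 / (w0 * (1.0 - w0))
    sigma_b[~np.isfinite(sigma_b)] = 0.0
    return centers[np.argmax(sigma_b)]

def segment_canopy(img):
    """Boolean mask: True where vegetation (high ExG), False for soil."""
    exg = excess_green(img)
    return exg > otsu_threshold(exg)
```

Applying the mask to the original image yields a segmented color image like panel (d); canopy statistics such as the R, G, and B values of Table 8 would then be computed over masked pixels only.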
Figure 3. Box plots of measured leaf nitrogen content and plant nitrogen content in cotton at different growth stages.
Figure 4. Box plots of cotton canopy R, G, and B values at different growth stages.
Figure 5. Pearson’s correlation coefficient (r) between individual UAV-derived vegetation indices and leaf nitrogen content and plant nitrogen content.
Figure 6. Scatterplots of measured versus predicted leaf nitrogen content and plant nitrogen content in cotton. Columns from left to right: Random Forest (RF), Support Vector Machine (SVM), Back Propagation Neural Network (BP), and Stepwise Multiple Linear Regression (SMLR); rows: peak squaring stage (a–d), peak flowering stage (e–h), early peak boll-forming stage (i–l), late peak boll-forming stage (m–p), boll-opening stage (q–t), and all stages (u–x). Red circles denote leaf nitrogen content (LNC) and blue triangles denote plant nitrogen content (PNC); the number of samples was 48 for each growth stage from 3 years.
Table 1. The experimental design in this study.

| N Treatments | PK-Ms | Proportion for Squaring Stage | Proportion for Bloom-Bolling Stages |
|---|---|---|---|
| N1 (506 kg ha−1) | PK-M1 | 100% | 0% |
|  | PK-M2 | 25% | 75% |
|  | PK-M3 | 50% | 50% |
|  | PK-M4 | 75% | 25% |
| N2 (402.5 kg ha−1) | PK-M1 | 100% | 0% |
|  | PK-M2 | 25% | 75% |
|  | PK-M3 | 50% | 50% |
|  | PK-M4 | 75% | 25% |
| N3 (299 kg ha−1) | PK-M1 | 100% | 0% |
|  | PK-M2 | 25% | 75% |
|  | PK-M3 | 50% | 50% |
|  | PK-M4 | 75% | 25% |
| N4 (195.5 kg ha−1) | PK-M1 | 100% | 0% |
|  | PK-M2 | 25% | 75% |
|  | PK-M3 | 50% | 50% |
|  | PK-M4 | 75% | 25% |
Table 2. Main parameters of the UAV used for the cotton seasons.

| Edition of UAV | Camera Model | Image Size | Maximum Flight Time (min) | Image Format | Weight (g) |
|---|---|---|---|---|---|
| DJI Mavic Pro | FC220 | 4000 × 3000 | 27 | JPEG | 743 |
Table 3. Processing steps with corresponding parameter settings in Agisoft PhotoScan software for generation of orthophotos from UAV imagery.

| Task | Parameter Setup |
|---|---|
| Aligning images | Accuracy: high; Pair selection: generic; Key points: 40,000; Tie points: 4000 |
| Building mesh | Surface type: height field; Source data: dense cloud; Face count: high |
| Positioning guided markers | Manual positioning of markers on the 11 evenly distributed GCPs for all the photos |
| Optimizing cameras | Default settings |
| Building dense point clouds | Quality: high; Depth filtering: mild |
| Building DEM | Surface: mesh; Other parameters: default |
| Building orthomosaic | Surface: mesh; Other parameters: default |
Table 4. Summary of field campaigns for the cotton seasons.

| Year | Date of UAV Flights | Date of Field Sampling | Growth Stage |
|---|---|---|---|
| 2018 | 24 June | 23 June | Peak squaring |
|  | 13 July | 12 July | Peak flowering |
|  | 2 August | 2 August | Early peak boll-forming |
|  | 12 August | 12 August | Late peak boll-forming |
|  | 23 August | 22 August | Boll-opening |
| 2019 | 25 June | 24 June | Peak squaring |
|  | 19 July | 19 July | Peak flowering |
|  | 6 August | 5 August | Early peak boll-forming |
|  | 20 August | 19 August | Late peak boll-forming |
|  | 23 September | 23 September | Boll-opening |
| 2020 | 14 June | 14 June | Peak squaring |
|  | 4 July | 3 July | Peak flowering |
|  | 27 July | 26 July | Early peak boll-forming |
|  | 2 August | 2 August | Late peak boll-forming |
|  | 17 September | 16 September | Boll-opening |
Table 5. Summary of vegetation indices derived from aerial orthophotos for the estimation of leaf nitrogen content and plant nitrogen content in cotton.

| Variable Coding | Index | Formula | References |
|---|---|---|---|
| V1 | R | DN values of the R band | - |
| V2 | G | DN values of the G band | - |
| V3 | B | DN values of the B band | - |
| V4 | r | R/(R + G + B) | [29] |
| V5 | g | G/(R + G + B) | [29] |
| V6 | b | B/(R + G + B) | [29] |
| V7 | GRRI | G/R | [30] |
| V8 | GBRI | G/B | [30] |
| V9 | RBRI | R/B | [30] |
| V10 | ARG | (R + G + B)/3 | - |
| V11 | GRVI | (G − R)/(G + R) | [31] |
| V12 | NDI | (G − R)/(G + R) | [32] |
| V13 | WI | (G − B)/ABS(R − G) | [29] |
| V14 | IKAW | (R − B)/(R + B) | [33] |
| V15 | GLI | (2 × G − R − B)/(2 × G + R + B) | [34] |
| V16 | VARI | (G − R)/(G + R − B) | [35] |
| V17 | ExR | 1.4 × r − g | [36] |
| V18 | ExG | 2 × g − r − b | [36] |
| V19 | ExB | 1.4 × b − g | [32] |
| V20 | ExGR | ExG − ExR | [37] |
| V21 | VEG | g/(r^a × b^(1 − a)), where a = 0.667 | [38] |
| V22 | CIVE | 0.441 × R − 0.881 × G + 0.385 × B + 18.78745 | [39] |
| V23 | MGRVI | (G² − R²)/(G² + R²) | [40] |
| V24 | RGBVI | (G² − B × R)/(G² + B × R) | [40] |
| V25 | RGRI | R/G | [41] |
| V26 | r/b | r/b | - |
| V27 | r − b | r − b | - |
| V28 | r + b | r + b | - |
| V29 | g − b | g − b | - |
| V30 | (r − g − b)/(r + g) | (r − g − b)/(r + g) | - |
Table 6. Basic statistics of the nitrogen content in cotton at different growth stages.

| Parameters | Stage | Min (g/kg) | Max (g/kg) | Mean (g/kg) | CV (%) |
|---|---|---|---|---|---|
| LNC | Peak squaring | 40.73 | 59.11 | 49.75 ± 3.84 | 7.73 |
|  | Peak flowering | 31.11 | 52.46 | 44.54 ± 4.07 | 9.13 |
|  | Early peak boll-forming | 28.41 | 48.35 | 38.65 ± 4.31 | 11.15 |
|  | Late peak boll-forming | 21.74 | 41.46 | 33.41 ± 3.84 | 11.49 |
|  | Boll-opening | 16.80 | 39.58 | 27.20 ± 4.28 | 15.72 |
| PNC | Peak squaring | 32.06 | 49.89 | 40.21 ± 3.51 | 8.73 |
|  | Peak flowering | 26.57 | 40.37 | 34.57 ± 3.31 | 9.57 |
|  | Early peak boll-forming | 19.82 | 35.50 | 28.32 ± 3.36 | 11.87 |
|  | Late peak boll-forming | 14.08 | 32.37 | 23.60 ± 3.85 | 16.30 |
|  | Boll-opening | 6.08 | 26.83 | 17.48 ± 4.00 | 22.91 |
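The CV column above is the coefficient of variation, 100 × standard deviation / mean. A one-line sketch (whether a sample or population standard deviation was used is not stated, so ddof=1 here is an assumption):

```python
import numpy as np

def cv_percent(values):
    """Coefficient of variation (%) = 100 * sample standard deviation / mean."""
    v = np.asarray(values, dtype=np.float64)
    return 100.0 * v.std(ddof=1) / v.mean()
```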
Table 7. Analysis of variance for leaf nitrogen content and plant nitrogen content under nitrogen treatments and phosphorus-potassium managements (PK-Ms) during 2018–2020.

| Treatment | LNC | PNC |
|---|---|---|
| Year | ** | ** |
| N | ** | ** |
| PK-Ms | ** | ** |
| Year × N | ** | ** |
| Year × PK-Ms | ** | ** |
| N × PK-Ms | ** | * |
| Year × N × PK-Ms | ns | ns |

Note: ** means significant at the 0.01 level; * means significant at the 0.05 level; ns means not significant.
Table 8. Basic statistics of the R, G, and B values in cotton at different growth stages.

| Band | Stage | Min | Max | Mean | CV (%) |
|---|---|---|---|---|---|
| R | Peak squaring | 77.20 | 114.99 | 94.74 ± 7.05 | 7.45 |
|  | Peak flowering | 73.53 | 126.27 | 102.29 ± 9.27 | 9.06 |
|  | Early peak boll-forming | 97.24 | 134.89 | 113.94 ± 8.08 | 7.09 |
|  | Late peak boll-forming | 105.31 | 143.29 | 124.56 ± 9.06 | 7.27 |
|  | Boll-opening | 113.44 | 140.10 | 125.71 ± 5.64 | 4.48 |
| G | Peak squaring | 89.85 | 166.29 | 121.49 ± 14.29 | 11.76 |
|  | Peak flowering | 72.79 | 170.78 | 120.81 ± 17.68 | 14.63 |
|  | Early peak boll-forming | 98.13 | 179.96 | 132.6 ± 15.22 | 11.48 |
|  | Late peak boll-forming | 110.54 | 183.46 | 141.88 ± 13.92 | 9.81 |
|  | Boll-opening | 114.71 | 144.92 | 126.63 ± 6.01 | 4.75 |
| B | Peak squaring | 70.33 | 104.89 | 86.50 ± 6.39 | 7.39 |
|  | Peak flowering | 65.70 | 114.86 | 92.08 ± 8.30 | 9.02 |
|  | Early peak boll-forming | 87.82 | 124.12 | 103.66 ± 7.35 | 7.09 |
|  | Late peak boll-forming | 97.76 | 132.78 | 115.08 ± 8.83 | 7.68 |
|  | Boll-opening | 112.83 | 140.96 | 124.97 ± 5.63 | 4.50 |
Table 9. Cotton nitrogen content estimates derived using UAV imagery and vegetation indices combined with three machine learning techniques and stepwise multiple linear regression.

| Nitrogen Content | Stage | RF R² | RF RMSE | RF rRMSE (%) | SVM R² | SVM RMSE | SVM rRMSE (%) | BP R² | BP RMSE | BP rRMSE (%) | SMLR R² | SMLR RMSE | SMLR rRMSE (%) |
|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
| LNC | Peak squaring | 0.9001 | 1.2309 | 2.46 | 0.8703 | 1.3883 | 2.78 | 0.7566 | 1.9088 | 3.82 | 0.8664 | 1.3973 | 2.79 |
|  | Peak flowering | 0.7967 | 1.8650 | 4.16 | 0.7261 | 2.1954 | 4.90 | 0.7123 | 2.2181 | 4.95 | 0.7638 | 1.9468 | 4.34 |
|  | Early peak boll-forming | 0.7430 | 2.2689 | 5.84 | 0.7033 | 2.3957 | 6.16 | 0.5939 | 3.0686 | 7.89 | 0.7119 | 2.3228 | 5.98 |
|  | Late peak boll-forming | 0.6785 | 2.3477 | 6.98 | 0.6717 | 2.4612 | 7.32 | 0.6392 | 2.3804 | 7.08 | 0.6744 | 2.1797 | 6.48 |
|  | Boll-opening | 0.4964 | 3.2866 | 11.95 | 0.5191 | 3.2787 | 11.92 | 0.4946 | 3.5326 | 12.85 | 0.4927 | 9.1517 | 33.28 |
|  | All stages | 0.7249 | 4.8741 | 12.51 | 0.6601 | 5.4362 | 13.95 | 0.6401 | 6.2970 | 16.16 | 0.6797 | 5.1064 | 13.10 |
| PNC | Peak squaring | 0.8757 | 1.2324 | 3.06 | 0.8556 | 1.3354 | 3.31 | 0.7468 | 1.7663 | 4.38 | 0.8356 | 1.4080 | 3.49 |
|  | Peak flowering | 0.7094 | 1.7874 | 5.16 | 0.6930 | 1.8920 | 5.46 | 0.6896 | 1.8553 | 5.36 | 0.6871 | 1.8217 | 5.26 |
|  | Early peak boll-forming | 0.6781 | 2.0032 | 7.08 | 0.5758 | 2.1281 | 7.52 | 0.5596 | 2.4951 | 8.79 | 0.6147 | 1.9852 | 7.02 |
|  | Late peak boll-forming | 0.6461 | 2.4598 | 10.40 | 0.6052 | 2.3304 | 9.85 | 0.4816 | 2.6834 | 11.35 | 0.6220 | 2.2843 | 9.66 |
|  | Boll-opening | 0.4138 | 3.4138 | 19.36 | 0.4930 | 2.8577 | 16.21 | 0.4325 | 2.9751 | 16.87 | 0.4782 | 2.8305 | 16.05 |
|  | All stages | 0.7003 | 5.6298 | 19.47 | 0.6381 | 6.6715 | 23.08 | 0.6363 | 5.3468 | 18.49 | 0.6648 | 5.5019 | 19.03 |
Note: The total number of samples was 480 for all stages and 96 for each growth stage.
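The three accuracy statistics of Table 9 can be computed as below. This is a minimal sketch assuming rRMSE is RMSE normalized by the mean of the observed values (the usual convention, though the normalization is not stated explicitly in the table):

```python
import numpy as np

def regression_metrics(observed, predicted):
    """Coefficient of determination (R2), RMSE, and relative RMSE (%)."""
    obs = np.asarray(observed, dtype=np.float64)
    pred = np.asarray(predicted, dtype=np.float64)
    residuals = obs - pred
    r2 = 1.0 - np.sum(residuals ** 2) / np.sum((obs - obs.mean()) ** 2)
    rmse = np.sqrt(np.mean(residuals ** 2))
    rrmse = 100.0 * rmse / obs.mean()  # assumption: normalized by mean observed
    return r2, rmse, rrmse
```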
Disclaimer/Publisher’s Note: The statements, opinions and data contained in all publications are solely those of the individual author(s) and contributor(s) and not of MDPI and/or the editor(s). MDPI and/or the editor(s) disclaim responsibility for any injury to people or property resulting from any ideas, methods, instructions or products referred to in the content.

Liu, Y.; Chen, Y.; Wen, M.; Lu, Y.; Ma, F. Accuracy Comparison of Estimation on Cotton Leaf and Plant Nitrogen Content Based on UAV Digital Image under Different Nutrition Treatments. Agronomy 2023, 13, 1686. https://doi.org/10.3390/agronomy13071686
