COMPARISON OF BACKPROPAGATION AND ERNN METHODS IN PREDICTING CORN PRODUCTION

PUTRO, SYAKUR, ROCHMAN, HUSNI, USFIROTUMMAMLU'AH, RACHMAD

East Java is one of the main producers of food crops in Indonesia. Its food crop commodities include corn, soybeans, peanuts, sweet potatoes, and cassava. Because these crops have many uses, demand for their production keeps increasing. The uncertain amount of food crop production poses a problem for the Department of Agriculture and Food Security of East Java Province when determining policy. To overcome this problem, a system is needed to predict food crop production in East Java. This study compares the Backpropagation algorithm and Elman Recurrent Neural Networks (ERNN). The data were obtained from the Department of Agriculture and Food Security of East Java Province, recorded quarterly from 2007 to 2020. In trial scenario 1, the Backpropagation algorithm produced the best MSE value of 0.00000063, compared with 0.00000627 for ERNN. In trial scenario 2, Backpropagation with gradient descent momentum produced the best MSE value of 0.000000003, again better than ERNN, which obtained an MSE value of 0.00000407. It can be concluded that the best algorithm in this study is Backpropagation with gradient descent momentum, because it produces the best MSE values and prediction results of all the algorithms compared.

Several Artificial Neural Network methods can be used for prediction, among them the Backpropagation algorithm and the Elman Recurrent Neural Network (ERNN). Research using the Backpropagation Neural Network and the Elman Recurrent Neural Network (ERNN) has been carried out for forecasting cement sales at PT Semen Indonesia (Persero) Tbk using monthly sales data from January 2006 to March 2018. The results state that the best model for forecasting cement sales at PT Semen Indonesia (Persero) Tbk is the Backpropagation Neural Network [7].
In addition, predictive research that compares 3 methods at once, namely Exponential Smoothing, Backpropagation, and Elman Recurrent Neural Network, has been carried out for the prediction of palawija production using 7 datasets. The results of this study indicate that the Elman Recurrent Neural Network is the best method for predicting palawija production [8].
Among the prediction studies above that use the Backpropagation algorithm and Elman Recurrent Neural Networks (ERNN), some show that Backpropagation is better than ERNN, while others show the opposite. Therefore, this study compares the Backpropagation and Elman Recurrent Neural Networks (ERNN) algorithms.
The main contribution of this research is to predict maize production using the Backpropagation Artificial Neural Network and Elman Recurrent Neural Networks (ERNN), as well as adding gradient descent with momentum (GDM) optimization to the training algorithm.

PRELIMINARIES
Forecasting predicts an event or quantity in the future by using relevant data or variables. The forecasting process uses past data, processed with an algorithm and grounded in a theoretical explanation, so that it is not merely a guess but is based on a strong theory [9].
Forecasting or prediction can also be interpreted as an attempt to predict something in the future by using data in the past (historical data) within a certain period and utilizing relevant information [11].

A. Artificial Neural Network
An Artificial Neural Network (ANN) is an algorithm that imitates biological neural networks [10]. Like the human brain, an ANN consists of neurons that are interconnected to deliver an impulse. The structure of a human nerve cell consists of the soma (nerve cell body), which is the site of synthesis and integration of nerve impulses. Dendrites collect incoming messages and carry them to the nerve cell body (input, processing). The axon (neurite) carries the nerve impulse to other nerve cells (output). This can be seen in Figure 1.

B. Backpropagation
The typical topology of the Backpropagation algorithm has three layers: the input layer, where data enters; the hidden layer, where data is processed; and the output layer, where the result is produced [11]. Before studying the Backpropagation algorithm itself, it helps to know its architecture.
Figure 3 shows the Backpropagation architecture. The weight matrix V connects the input layer and the hidden layer; with 1 input neuron and 5 hidden neurons, V has 1 x 5 = 5 weights, and if a bias is added between the input layer and the hidden layer, the total number of V weights is (1 + 1) x 5 = 10. The same applies to the weights connecting the hidden layer and the output layer. Based on this, this study uses the architecture shown in Table 1: three layers consisting of an input layer with 1 neuron, a hidden layer with 5 neurons, and an output layer with 1 neuron.
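The 1-5-1 architecture and the bias-inclusive weight counts above can be illustrated with a minimal NumPy sketch of one forward pass. This is a hedged illustration, not the authors' implementation; the sigmoid activation and the random initialization scale are assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# 1-5-1 architecture from the text: 1 input, 5 hidden, 1 output neuron.
n_in, n_hidden, n_out = 1, 5, 1

# V connects input to hidden; with a bias row it holds (1 + 1) x 5 = 10 weights.
V = rng.normal(scale=0.1, size=(n_in + 1, n_hidden))
# W connects hidden to output; with a bias row it holds (5 + 1) x 1 = 6 weights.
W = rng.normal(scale=0.1, size=(n_hidden + 1, n_out))

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def forward(x):
    """One forward pass for a single scalar input x."""
    a = np.concatenate(([1.0], np.atleast_1d(x)))  # prepend bias term
    h = sigmoid(a @ V)                             # hidden activations
    hb = np.concatenate(([1.0], h))                # bias term for output layer
    y = sigmoid(hb @ W)                            # network output in (0, 1)
    return float(y[0])
```

With normalized inputs in [0, 1], the sigmoid output stays in (0, 1), matching the normalized prediction target.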

The Elman Recurrent Neural Networks (ERNN) algorithm is a development of the
Backpropagation ANN algorithm. What distinguishes it from Backpropagation is feedback on the hidden layer. This feedback produces an additional layer called the context layer [12]. The ERNN architecture can be seen in Figure 4. The neurons in the context layer are used as additional input, so the number of neurons in the context layer is the same as in the hidden layer [12].
The hidden layer is connected to the context layer with a weight value of one.
The result of backward propagation in the context layer is the previous value of the hidden layer [13]. There are two processes in the Backpropagation and ERNN algorithms, namely:

Training Process
In the Backpropagation algorithm, the training process is an iteration of forward and backward propagation that aims to obtain trained weights and biases. The Elman Recurrent Neural Networks algorithm has almost the same process; the difference is the feedback on the hidden layer, whose result is an additional layer called the context layer. The neurons in the context layer serve as additional input to the training process.
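The context-layer feedback described above, where the context holds the previous hidden state and the hidden-to-context connection is fixed at one, can be sketched as a forward pass over a sequence. This is an assumed minimal sketch, not the authors' code; the sigmoid activation and weight shapes are illustrative.

```python
import numpy as np

rng = np.random.default_rng(1)
n_in, n_hidden = 1, 5

# Input and context both feed the hidden layer; the context layer is a copy
# of the previous hidden state (hidden-to-context weight fixed at 1).
W_in  = rng.normal(scale=0.1, size=(n_in, n_hidden))
W_ctx = rng.normal(scale=0.1, size=(n_hidden, n_hidden))
W_out = rng.normal(scale=0.1, size=(n_hidden, 1))

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def ernn_forward(sequence):
    """Forward pass of an Elman network over a sequence of scalar inputs."""
    context = np.zeros(n_hidden)  # context starts empty
    outputs = []
    for x in sequence:
        h = sigmoid(np.atleast_1d(x) @ W_in + context @ W_ctx)
        outputs.append(float((h @ W_out)[0]))
        context = h.copy()        # feedback: context = previous hidden values
    return outputs
```

The only structural change from plain Backpropagation is the `context @ W_ctx` term and the copy of the hidden state after each step.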

Testing Process
This process is carried out on test data. In the Backpropagation and Elman Recurrent Neural Networks testing process, only forward propagation is performed, using the weights produced by the training process.
The test results are quarterly food crop predictions whose error value is calculated with the Mean Square Error, comparing the target data and the predicted data.

C. Normalization and denormalization
This study uses the min-max normalization method, which maps the data into the range zero to one (0-1) [14]. Normalization aims to obtain data of smaller magnitude that represents the original data without losing its characteristics. The normalization and denormalization formulas are:

x' = \frac{x - x_{\min}}{x_{\max} - x_{\min}} \quad (1)

x = x'(x_{\max} - x_{\min}) + x_{\min} \quad (2)

The prediction error is measured with the Mean Square Error:

\mathrm{MSE} = \frac{1}{n}\sum_{i=1}^{n}(y_i - \hat{y}_i)^2 \quad (3)

where MSE is the average squared error, y_i is the actual data, \hat{y}_i is the predicted data, and the amount of data is indicated by n.
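The min-max normalization, its inverse (denormalization), and the Mean Square Error described in this section can be sketched as straightforward helpers; the function names here are illustrative choices, not from the source.

```python
import numpy as np

def normalize(x, x_min, x_max):
    """Min-max scaling into the range [0, 1]."""
    return (x - x_min) / (x_max - x_min)

def denormalize(x_norm, x_min, x_max):
    """Inverse of min-max scaling, back to original units."""
    return x_norm * (x_max - x_min) + x_min

def mse(actual, predicted):
    """Mean Square Error between target data and predicted data."""
    actual, predicted = np.asarray(actual, float), np.asarray(predicted, float)
    return float(np.mean((actual - predicted) ** 2))
```

For example, with a production range of 0 to 10, a value of 5 normalizes to 0.5 and denormalizes back to 5 exactly.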

A. Data Collection
The data obtained from the Department of Agriculture and Food Security of East Java Province cover the last 14 years, recorded quarterly from 2007 to 2020. The parameters to be used as input data were then selected, and the target data were determined. In this study, the amount of production is the input parameter for prediction.

B. Testing of Backpropagation and ERNN methods
The trials in this study were carried out by varying the portion of training data from 70% to 90%, the maximum number of iterations, and the learning rate value. The resulting test scenario is shown in the table below. The Backpropagation algorithm uses 3 layers with 1 input neuron, 5 hidden neurons, and 1 output neuron, while the ERNN algorithm uses 4 layers with 1 input neuron, 1 hidden neuron, 5 context neurons, and 1 output neuron. Table 5 shows the MSE values of the tests carried out with different amounts of training and testing data and different learning rates on corn production prediction data for Bangkalan Regency. Figure 10 shows the comparison of the MSE values between Backpropagation and ERNN from the test results with a learning rate of 0.9.
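The 70%-90% split described above can be sketched as a chronological split, which keeps the quarterly time order intact (an assumption consistent with time-series prediction; the helper name is hypothetical).

```python
def chrono_split(series, train_frac):
    """Split a time-ordered series: the first train_frac of points become
    training data, the remainder testing data. No shuffling, so the
    quarterly time order is preserved."""
    cut = int(len(series) * train_frac)
    return series[:cut], series[cut:]
```

With 14 years of quarterly data (56 points), a 70% split yields 39 training and 17 testing quarters.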

C. Backpropagation and ERNN trials with Gradient Descent optimization
The trial scenario at this stage compares the Backpropagation algorithm with gradient descent momentum against ERNN with gradient descent momentum. The addition of the momentum term aims to damp large weight changes. In the first trial, the best Backpropagation and ERNN results were obtained at a learning rate of 0.9, so this setting is reused here without recombining other parameters. The second trial scenario, on corn production prediction data for Bangkalan Regency, can be seen in Table 6. Figure 11 shows the comparison of the MSE values between Backpropagation and ERNN with gradient descent momentum optimization from the test results with a learning rate of 0.9.
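The momentum term described above can be sketched as a single weight-update step: the velocity accumulates a fraction of the previous update, which damps abrupt weight changes between iterations. This is a generic gradient-descent-with-momentum sketch, not the authors' exact update rule.

```python
import numpy as np

def gdm_update(w, grad, velocity, lr=0.9, momentum=0.9):
    """One gradient descent with momentum (GDM) step.

    velocity carries over a fraction (momentum) of the previous update,
    so successive updates in the same direction accelerate while
    oscillating updates are damped."""
    velocity = momentum * velocity - lr * grad
    return w + velocity, velocity
```

On the first step (zero initial velocity), this reduces to plain gradient descent: with lr = 0.9 and grad = 0.5, the weight moves by -0.45.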

CONCLUSION
Based on the system testing and the several research trial scenarios that have been carried out, it can be concluded that: