PERFORMANCE OF HAF IN ATTENTION-BILSTM FOR PREDICTING THE QUALITY OF AUTOMOBILE ANCILLARY SUPPLIERS



INTRODUCTION
Deep learning [15] is a category of machine learning that focuses on flexible representations of data organized as a hierarchy of concepts. A deep learning model is trained on input data from which outputs are extracted, and these concepts apply to both supervised and unsupervised learning. Applying deep learning algorithms to complex problems takes considerable time, but the automobile industry, one of the core industries of the Indian economy [14], offers problems worth the effort. Auto ancillary products are the backbone of automobile vehicles: both two-wheelers and four-wheelers need many spare parts, whether to assemble a vehicle or to maintain it after sale. Analyzing auto ancillary products on the basis of cost, quality, and delivery provides sustainable feedback from customers and users. Quality is especially important, since every customer expects the maximum output from a vehicle as well as from its ancillary products. To maintain the quality of auto ancillary products, manufacturers check the quality of raw materials from suppliers, and during the production phase the products are produced with utmost care. One aspect to be considered while concentrating on quality is therefore the raw material from the supplier.
Suppliers of raw material must ensure a good-quality base for auto ancillary products, since raw materials play a vital role in quality sustenance and maintenance. A supplier has to produce good-quality raw materials, from the smallest piece to the largest. The dataset comprises quality checks performed at the supplier site and again at the auto ancillary company once the raw materials are received. It consists of vendor test reports, fitment, and process document details, which give information about supplier raw materials tested at the supplier's premises. Raw materials follow the guidelines of the organization and are tested for design, fitment, and process to maintain product quality. This supplier dataset was given to the novel model to classify and analyze the data, with the aim of improving and sustaining supplier quality prediction. Deep learning [4] is a recent family of machine learning techniques for data representation; building deep models with hidden layers over large data improves performance and leads to a better model. Alex [1] describes the difference between unidirectional and bidirectional long short-term memory (LSTM), applied to speech recognition and online data classification. Input patterns were detected with bidirectional LSTM [2] for non-repeated words and characters, and the authors extract pattern-matching features with improved performance.

LITERATURE WORK
A neural model was suggested by Oren [2] for NLP tasks such as word-sense disambiguation and entity recognition. Wenhui [3] authored a graph-based dependency parser using a neural network model that employs bidirectional LSTM (BiLSTM) to standardize contextual information.
Many activation functions exist; Guifany [4] proposed an improved piecewise ReLU for building better outputs in convolutional neural networks. Hock Huany [5] suggested an improvised ReLU activation function for deep learning datasets, showing good performance results. Julius [6] showed how ReLU activation functions can be used in neural networks to address optimization problems and the loss function.
The ReLU activation function was used by Hidenor [7] in the hidden layers for sparse regularization of a convolutional neural network, zeroing inputs during training so that unwanted growth of the output is reduced. Armenak [8] proposes an integral formula for shallow neural networks with ReLU activation functions for constructing and training the target function. Weng [9] proposed a unique structure of ReLU networks providing two computationally efficient algorithms, Fast-Lin and Fast-Lip, to verify the robustness property of ReLU networks. General deep ReLU neural networks for the modulus function were proposed [10] to learn a better unique model. Together these works establish the importance of the ReLU activation function across a variety of datasets and fields. Xiao [11] proposed creating a hidden layer for neural networks using the ReLU activation function, paving the way for multiple neurons with better results. Thus ReLU was chosen as the activation function for one of the hidden layers.
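For reference, the ReLU activation these works rely on simply passes positive inputs through and maps negatives to zero; a minimal NumPy sketch:

```python
import numpy as np

def relu(z):
    # ReLU: keep positive values, map negatives to zero
    return np.maximum(0.0, z)

z = np.array([-2.0, -0.5, 0.0, 1.5, 3.0])
print(relu(z))  # negatives become 0.0, positives pass through unchanged
```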
The objectives are as follows:

EXISTING WORK
The existing work used a tanh activation function in an attention-based BiLSTM model for prediction. The previous work was built on a bidirectional LSTM with an attention layer and a modified tanh function for an export dataset. That model does not suit the present data, which has multi-class labels and consists mostly of categorical words, requiring a distinct model to analyze and predict. Since the dataset framework is entirely different, the proposed work was needed, even though the modified tanh function with BiLSTM showed good performance on the export dataset.

Step 3: Set the cell dimensions for the RNN and the features.
Step 4: To define the model, set the units for LSTM, embedding, input, and output.
Step 5: Form the BiLSTM layer to compute the integrated sequence of inputs.
➢ Concatenate the forward and backward LSTM layers as defined below: c(t) = f(t) ⊕ b(t)
Step 6: Set the hybrid activation function (HAF) for all layers except the output layer: f = 0.5z * tanh(relu(z))
Step 7: Compute the attention layer.
Run the input data through all layers and train the network (N).
End for.
Step 11: Loss calculation: Loss = −(1/N) Σᵢ yᵢ log(ŷᵢ)
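Steps 5 through 7 above can be sketched in NumPy. The forward and backward hidden states are toy values, and the attention scoring (a softmax over summed activations) is a simplified stand-in for the model's attention layer, not the paper's exact formulation:

```python
import numpy as np

def haf(z):
    # Hybrid activation function (Step 6): f(z) = 0.5 * z * tanh(relu(z))
    return 0.5 * z * np.tanh(np.maximum(0.0, z))

def attention(h):
    # Simplified attention (Step 7): softmax over one score per time step,
    # then a weighted sum of the hidden states into a context vector.
    scores = h.sum(axis=1)
    weights = np.exp(scores - scores.max())
    weights /= weights.sum()
    return weights @ h

# Step 5: concatenate forward and backward hidden states, c(t) = f(t) ⊕ b(t)
f_t = np.array([[0.1, 0.2], [0.3, 0.4]])   # forward LSTM outputs (T=2, d=2)
b_t = np.array([[0.5, 0.6], [0.7, 0.8]])   # backward LSTM outputs (T=2, d=2)
c_t = np.concatenate([f_t, b_t], axis=1)   # shape (2, 4)

context = attention(haf(c_t))
print(c_t.shape, context.shape)
```

Note that the HAF is zero for negative inputs (because relu(z) = 0 there) and grows roughly like 0.5·z for large positive z, blending the tanh and ReLU behaviors named in Step 6.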

Pre-processing
Pre-processing of any data is the first step towards analysis. The data was examined for its size and the parameters available. The pre-processing steps to be carried out depend on the dataset's structure. Missing values were replaced with the appropriate values available in the dataset using the forward-fill method, in which the last valid value is propagated to the missing entries. The dataset consists of categorical values, numbers, and so on. The categorical values, as words, are isolated and transformed into suitable numbers using the label encoder function, which converts labels or words into a machine-readable numeric form. Duplicates were removed so that unique values remain for analysis, and the result was converted to a comma-separated file for classification.
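A minimal sketch of this pre-processing pipeline in pandas, with illustrative column names rather than the actual supplier fields; `pd.factorize` stands in for the label encoder:

```python
import pandas as pd

# Toy supplier records; column names are illustrative, not the real fields.
df = pd.DataFrame({
    "vendor_test_report": ["pass", None, "pass", "fail", "pass"],
    "fitment": ["ok", "ok", "ok", "loose", "ok"],
})

# Forward fill: the last valid entry replaces each missing value
df = df.ffill()

# Encode categorical words as numbers (label encoding)
for col in df.columns:
    df[col] = pd.factorize(df[col])[0]

# Remove duplicates and produce comma-separated output for classification
df = df.drop_duplicates()
csv_text = df.to_csv(index=False)
print(csv_text)
```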

Hybrid Activation Function in Attention based BiLSTM
In the proposed model, a bidirectional LSTM is the base for classification and prediction. Long short-term memory algorithms internalize the concepts of artificial neural networks.
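As an illustration of the LSTM machinery underlying the BiLSTM, a single time step with its input, forget, and output gates can be written in NumPy (the weights here are random toy values, not trained parameters):

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def lstm_step(x, h_prev, c_prev, W, U, b):
    # One LSTM time step. W, U, b stack the parameters of the four
    # blocks: input gate i, forget gate f, output gate o, candidate g.
    z = W @ x + U @ h_prev + b
    i, f, o, g = np.split(z, 4)
    c = sigmoid(f) * c_prev + sigmoid(i) * np.tanh(g)  # new cell state
    h = sigmoid(o) * np.tanh(c)                        # new hidden state
    return h, c

d, dx = 3, 2                       # hidden size, input size
rng = np.random.default_rng(0)
h, c = lstm_step(rng.normal(size=dx), np.zeros(d), np.zeros(d),
                 rng.normal(size=(4 * d, dx)), rng.normal(size=(4 * d, d)),
                 np.zeros(4 * d))
print(h.shape, c.shape)
```

A bidirectional LSTM runs one such recurrence forward over the sequence and a second one backward, then concatenates the two hidden states at each time step.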

Results and discussion
The proposed work presents various results comparing the existing work with the novel work.
HAF in attention-based bidirectional LSTM shows better performance than the existing work. The comparison uses several metrics; those considered for this study were accuracy, AUC, precision, loss, and recall. Accuracy is calculated as the ratio of correctly predicted data to the total number of data points available. Figure 2 shows an accuracy of 92% for the existing model, and Figure 3 shows an accuracy of 94% for the proposed model. Area under the curve (AUC) is a metric that summarizes the curve, computed here using the trapezoidal rule. If the AUC value is above 0.5 the model is better than random, above 0.8 it is acceptable, and above 0.9 the model is rated excellent; the higher the AUC, the better the performance of the model.

These results matter for the auto ancillary manufacturers. Suppliers provide raw materials to the product manufacturers, and the raw material has to meet standard procedures for the quality and delivery aspects. If quality were compromised, the products manufactured would not deliver the quality expected by the customers; this is a chain reaction for quality maintenance and delivery logistics. The supplier's raw materials are therefore one of the deciding factors for maintaining quality standards. The proposed model applies a hybrid of the tanh and ReLU activation functions to analyze the performance of the model, and it showed better performance than the existing model.

The proposed model achieved 94% accuracy and an increase in the other metrics: precision, recall, and area under the curve. These metrics show improved performance compared with the existing work.
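The accuracy and trapezoidal-rule AUC computations described above can be sketched as follows, using illustrative predictions and ROC points rather than the study's data:

```python
import numpy as np

# Accuracy: correctly predicted data divided by total data points
y_true = np.array([1, 0, 1, 1, 0])
y_pred = np.array([1, 0, 0, 1, 0])
accuracy = float((y_true == y_pred).mean())

# AUC via the trapezoidal rule over (FPR, TPR) points of an ROC curve
fpr = np.array([0.0, 0.1, 0.4, 1.0])
tpr = np.array([0.0, 0.6, 0.9, 1.0])
auc = float(np.sum(np.diff(fpr) * (tpr[1:] + tpr[:-1]) / 2.0))

print(accuracy, round(auc, 3))  # 0.8 0.825
```

With these toy values the model would sit in the "acceptable" band (AUC above 0.8) described above.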