Open access

An Empirical Study for the Deep Learning Models


Published under licence by IOP Publishing Ltd
Citation: Monika Sethi et al 2021 J. Phys.: Conf. Ser. 1950 012071. DOI: 10.1088/1742-6596/1950/1/012071


Abstract

Deep Learning (DL) models have proven to be very powerful in solving many hard problems, especially those related to computer vision, text, speech, and classification. However, designing such models involves a large design space that needs to be examined carefully. The Convolutional Neural Network (CNN) is the most popular neural network architecture because it extracts features automatically, in contrast to conventional machine learning algorithms (CMLA). Our aim in this paper is to lessen the human effort required to design architectures by using a systematic architecture development process that allows the exploration of a huge design space by automating model construction, alternative generation, and assessment. The main operations in a CNN are convolution, pooling, flattening, and the full connection between the input and output layers. The CIFAR-10 dataset, consisting of 60,000 colour images in 10 classes (airplane, automobile, bird, cat, deer, dog, frog, horse, ship, and truck), is considered for the study. It is expected that the performance of the CNN model can be further improved by using a deeper network architecture, increasing the number of epochs, or applying data augmentation. In this paper, an attempt has been made to build simple and deeper CNN models on the CIFAR-10 dataset, and a comparison has been carried out to check the accuracy achieved by both models.
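For illustration only, the sketch below shows a "simple" CNN on CIFAR-10 written with the Keras API (tf.keras), wiring together the four operations named in the abstract: convolution, pooling, flattening, and the full connection. It is not the authors' exact architecture; the layer counts, filter sizes, and number of epochs are assumptions chosen to keep the example minimal.

# Minimal sketch (assumed hyperparameters, not the paper's exact model):
# a simple CNN trained on CIFAR-10 with tf.keras.
import tensorflow as tf
from tensorflow.keras import layers, models

# Load CIFAR-10: 60,000 32x32 colour images in 10 classes
# (airplane, automobile, bird, cat, deer, dog, frog, horse, ship, truck).
(x_train, y_train), (x_test, y_test) = tf.keras.datasets.cifar10.load_data()
x_train, x_test = x_train / 255.0, x_test / 255.0  # scale pixels to [0, 1]

model = models.Sequential([
    layers.Conv2D(32, (3, 3), activation="relu", input_shape=(32, 32, 3)),  # convolution
    layers.MaxPooling2D((2, 2)),                                            # pooling
    layers.Conv2D(64, (3, 3), activation="relu"),
    layers.MaxPooling2D((2, 2)),
    layers.Flatten(),                                                        # flattening
    layers.Dense(64, activation="relu"),                                     # full connection
    layers.Dense(10, activation="softmax"),                                  # 10-class output
])

model.compile(optimizer="adam",
              loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])

# Train briefly; as the abstract notes, a deeper network, more epochs,
# or data augmentation would be expected to improve test accuracy.
model.fit(x_train, y_train, epochs=10, validation_data=(x_test, y_test))
model.evaluate(x_test, y_test)

A "deeper" variant for the comparison described in the abstract would add further Conv2D/MaxPooling2D stages before the Flatten layer, keeping the rest of the training setup unchanged so that accuracy differences can be attributed to depth.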


Content from this work may be used under the terms of the Creative Commons Attribution 3.0 licence. Any further distribution of this work must maintain attribution to the author(s) and the title of the work, journal citation and DOI.
