Deep learning is a field that has emerged from the capabilities of machine learning and artificial intelligence. In deep learning, various parameters are essential to building an efficient model, including the activation function, the loss function, the number of layers, and the number of neurons. Finding an optimal working configuration for all of these variables is time-consuming and challenging, especially for projects with tight deadlines. In this paper, we explore the parameters that define the environment in which deep learning models operate. Experimental results are also discussed to identify the configuration that makes deep-learning-based text classification optimal. These results should help practitioners and researchers obtain a ready-to-use setup for text classification projects.
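As a minimal sketch of how these parameters fit together, the snippet below builds a small feed-forward text classifier in a Keras-style setup. Keras, the multi-hot input representation, and every specific value (vocabulary size, class count, layer count, neuron count, activation, loss, optimizer) are illustrative assumptions for exposition, not the configuration reported in the paper; each constant marks one of the knobs the study varies.

# A minimal, illustrative sketch (not the paper's reported setup):
# a small feed-forward text classifier showing where each parameter
# under study appears. All values below are assumptions.
from tensorflow import keras
from tensorflow.keras import layers

NUM_WORDS = 10000                   # vocabulary size of the multi-hot input (assumed)
NUM_CLASSES = 46                    # number of topic labels (assumed)
HIDDEN_LAYERS = 2                   # "number of layers" parameter
NEURONS_PER_LAYER = 64              # "number of neurons" parameter
ACTIVATION = "relu"                 # activation function (e.g. relu, tanh, swish)
LOSS = "categorical_crossentropy"   # loss function for multi-class text labels
OPTIMIZER = "adam"                  # optimizer (e.g. adam, sgd, rmsprop)

# Stack the chosen number of hidden layers at the chosen width.
model = keras.Sequential()
model.add(keras.Input(shape=(NUM_WORDS,)))
for _ in range(HIDDEN_LAYERS):
    model.add(layers.Dense(NEURONS_PER_LAYER, activation=ACTIVATION))
model.add(layers.Dense(NUM_CLASSES, activation="softmax"))

# Tie the loss-function and optimizer choices into the training setup.
model.compile(optimizer=OPTIMIZER, loss=LOSS, metrics=["accuracy"])
model.summary()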
