Cost optimization plays an important role in any product. Gradient descent is an optimization algorithm used to solve local minimization problems and, in particular, to minimize the cost function in various machine learning models. We discuss the main variants of the gradient descent algorithm: batch gradient descent, stochastic gradient descent, and mini-batch gradient descent. This paper focuses on batch gradient descent and stochastic gradient descent, and presents a comparative study of the two algorithms.
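As an illustration of the three variants named above, the following sketch minimizes an ordinary least-squares cost with each update scheme. The dataset, learning rates, and epoch counts are illustrative assumptions for this sketch, not values taken from the paper.

```python
import numpy as np

def batch_gradient_descent(X, y, lr=0.1, epochs=100):
    """Batch GD: one update per epoch, using the gradient over the full dataset."""
    w = np.zeros(X.shape[1])
    for _ in range(epochs):
        grad = X.T @ (X @ w - y) / len(y)  # gradient of the mean squared error / 2
        w -= lr * grad
    return w

def stochastic_gradient_descent(X, y, lr=0.05, epochs=100, seed=0):
    """SGD: one update per sample, visiting samples in a random order each epoch."""
    rng = np.random.default_rng(seed)
    w = np.zeros(X.shape[1])
    for _ in range(epochs):
        for i in rng.permutation(len(y)):
            grad = X[i] * (X[i] @ w - y[i])  # gradient from a single sample
            w -= lr * grad
    return w

def mini_batch_gradient_descent(X, y, lr=0.05, epochs=100, batch=2, seed=0):
    """Mini-batch GD: one update per small random batch, a compromise between the two."""
    rng = np.random.default_rng(seed)
    w = np.zeros(X.shape[1])
    for _ in range(epochs):
        idx = rng.permutation(len(y))
        for start in range(0, len(y), batch):
            b = idx[start:start + batch]
            grad = X[b].T @ (X[b] @ w - y[b]) / len(b)  # gradient over the batch
            w -= lr * grad
    return w
```

On a noiseless linear dataset all three recover the same weights; they differ in how many samples each update touches, which is the trade-off (cost per update versus gradient noise) that the comparison in this paper is concerned with.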
