Do gradient descent methods always converge to the same point?

2 Answers
MockInterview Staff answered 5 years ago

No, they do not. On a non-convex loss surface, gradient descent can settle in a local minimum (a local optimum) instead of the global optimum. Which point it reaches depends on the data (the shape of the loss surface) and the starting conditions (the initialization).
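To make this concrete, here is a minimal sketch in plain NumPy. The function, step size, and starting points are illustrative assumptions, not from the answer above; the point is only that plain gradient descent can land in different minima of a non-convex function depending on where it starts:

```python
import numpy as np

def grad(x):
    # Gradient of f(x) = x**4/4 - x**2/2 + 0.1*x, a non-convex
    # function with two local minima (near x = -1 and x = +1).
    return x**3 - x + 0.1

def gradient_descent(x0, lr=0.05, steps=200):
    x = x0
    for _ in range(steps):
        x -= lr * grad(x)
    return x

# Two different starting points converge to two different minima.
print(gradient_descent(-2.0))  # approx -1.05 (the global minimum here)
print(gradient_descent(+2.0))  # approx +0.95 (a worse local minimum)
```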

Ashish answered 4 weeks ago

This is not only a question about local minima. The answer is both yes and no: the variants rarely stop at exactly the same point, but they usually end up quite close (see the sketch below).

Batch GD: converges to the optimum solution
SGD: good, but not exactly the optimum
Mini-batch GD: in between the other two
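As a rough illustration (the synthetic data and hyperparameters here are my own assumptions), the sketch below fits a one-parameter least-squares model with all three variants. Batch GD essentially matches the closed-form optimum, while SGD and mini-batch GD end up close to it but not identical because of gradient noise:

```python
import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 1))
y = 3.0 * X[:, 0] + rng.normal(scale=0.5, size=200)

def run(batch_size, lr=0.05, epochs=50):
    w = 0.0
    n = len(X)
    for _ in range(epochs):
        idx = rng.permutation(n)  # reshuffle each epoch
        for start in range(0, n, batch_size):
            b = idx[start:start + batch_size]
            # Gradient of mean squared error on the current batch.
            g = 2 * np.mean((X[b, 0] * w - y[b]) * X[b, 0])
            w -= lr * g
    return w

w_star = np.sum(X[:, 0] * y) / np.sum(X[:, 0] ** 2)  # closed-form optimum
print("closed form :", w_star)
print("batch GD    :", run(batch_size=200))  # essentially w_star
print("mini-batch  :", run(batch_size=32))   # close to w_star
print("SGD         :", run(batch_size=1))    # close, but the noisiest
```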
