
Which of the following statement(s) is/are true for Gradient Descent (GD) and Stochastic Gradient Descent (SGD)?

  1. In GD and SGD, you update a set of parameters in an iterative manner to minimize the error function.
  2. In SGD, you must run through all the samples in your training set for a single parameter update in each iteration.
  3. In GD, you use either the entire set of data points or a subset of the training data to update a parameter in each iteration.
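
To make the contrast behind these statements concrete, here is a minimal sketch in plain NumPy (the function names and toy data are illustrative assumptions, not part of the original question): GD computes its gradient over the full training set for every parameter update, while SGD updates on a single randomly drawn sample.

```python
import numpy as np

# Toy data: y = 3x + noise (illustrative only)
rng = np.random.default_rng(0)
X = rng.uniform(-1, 1, size=(100, 1))
y = 3 * X[:, 0] + rng.normal(0, 0.1, size=100)

def gd_step(w, X, y, lr=0.1):
    """Gradient Descent: one update uses the gradient over ALL samples."""
    grad = 2 * X.T @ (X @ w - y) / len(y)
    return w - lr * grad

def sgd_step(w, x_i, y_i, lr=0.1):
    """Stochastic Gradient Descent: one update uses a SINGLE sample."""
    grad = 2 * x_i * (x_i @ w - y_i)
    return w - lr * grad

w_gd = np.zeros(1)
for _ in range(100):        # each GD iteration touches the whole dataset
    w_gd = gd_step(w_gd, X, y)

w_sgd = np.zeros(1)
for _ in range(100):        # each SGD iteration touches one random sample
    i = rng.integers(len(y))
    w_sgd = sgd_step(w_sgd, X[i], y[i])

print(w_gd, w_sgd)          # both estimates move toward the true slope of 3
```

On this least-squares toy problem, both variants head toward the true slope; SGD's path is noisier because each update sees only one sample, but each of its updates costs a fraction of a full-batch gradient.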
