At the end of an epoch of the gradient descent algorithm, a weight w0 is updated to a new value of 1.75. The errorDelta value at the end of the epoch was 2.5, and a learning rate of 0.1 was used. What was the value of w0 before the update?
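A minimal sketch of how to recover the previous weight, assuming the standard update rule w_new = w_old - learning_rate * errorDelta (the sign convention can differ between texts, so this is one common convention, not the only one). Inverting the rule gives w_old = w_new + learning_rate * errorDelta. The function name `previous_weight` is a hypothetical helper, not from the original question.

```python
def previous_weight(w_new, error_delta, learning_rate):
    """Invert the update w_new = w_old - learning_rate * error_delta
    to recover the weight before the update (assumed convention)."""
    return w_new + learning_rate * error_delta

# Plug in the values from the question:
# w_new = 1.75, errorDelta = 2.5, learning rate = 0.1
w0_prev = previous_weight(1.75, 2.5, 0.1)
print(w0_prev)  # 1.75 + 0.1 * 2.5 = 2.0
```

Under this convention, the previous value of w0 was 2.0.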