Given a differentiable objective function $f:\mathbb{R}^n \to \mathbb{R}$, gradient descent generates a sequence of iterates by the update rule: