A) Batch gradient descent updates the parameters after every training example, while stochastic gradient descent updates them after processing the entire training set
B) Batch gradient descent updates the parameters after processing the entire training set, while stochastic gradient descent updates them after every training example
C) There is no difference; both methods update the parameters at the same frequency
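The distinction the options test can be sketched in code: batch gradient descent computes the gradient over the whole training set and makes one update per epoch, while stochastic gradient descent updates after each individual example. This is a minimal illustrative sketch, not any particular library's implementation; the toy data, learning rate, and function names are assumptions chosen for clarity.

```python
def batch_gd(xs, ys, w=0.0, lr=0.05, epochs=50):
    """One parameter update per pass over the ENTIRE training set."""
    updates = 0
    for _ in range(epochs):
        # Gradient of squared loss for y = w*x, averaged over all examples.
        grad = sum(2 * (w * x - y) * x for x, y in zip(xs, ys)) / len(xs)
        w -= lr * grad
        updates += 1
    return w, updates

def sgd(xs, ys, w=0.0, lr=0.05, epochs=50):
    """One parameter update per individual training example."""
    updates = 0
    for _ in range(epochs):
        for x, y in zip(xs, ys):
            # Gradient from a single example, applied immediately.
            grad = 2 * (w * x - y) * x
            w -= lr * grad
            updates += 1
    return w, updates

xs, ys = [1.0, 2.0, 3.0], [2.0, 4.0, 6.0]  # toy data: y = 2x
w_b, n_b = batch_gd(xs, ys)
w_s, n_s = sgd(xs, ys)
print(n_b, n_s)  # 50 updates vs 150 updates over the same 50 epochs
```

Over 50 epochs on three examples, batch GD performs 50 updates while SGD performs 150, even though both see the same data; both recover w ≈ 2 here.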