

Under what two conditions does stochastic gradient descent outperform 2nd-order optimization
techniques such as iteratively reweighted least squares?

A.
When the volume of input data is so large and diverse that a 2nd-order optimization technique
can only be fit to a sample of the data.

B.
When the model’s estimates must be updated in real time in order to account for
new observations.

C.
When the input data can easily fit into memory on a single machine, but we want to calculate
confidence intervals for all of the parameters in the model.

D.
When we are required to find the parameters that return the optimal value of the objective
function.
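For context on what the question is probing: SGD processes one observation at a time, so it never needs the whole dataset in memory and can incorporate fresh observations without a full refit, which is exactly where second-order batch methods like IRLS struggle. A minimal sketch of per-example SGD for logistic regression (all names and the toy data stream are hypothetical, for illustration only):

```python
import math
import random

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

def sgd_logistic(stream, lr=0.1, dim=2):
    # Each (x, y) pair updates the weights immediately, so the model
    # can absorb new observations as they arrive -- no pass over the
    # full dataset is required, unlike batch 2nd-order methods.
    w = [0.0] * dim
    b = 0.0
    for x, y in stream:
        p = sigmoid(b + sum(wi * xi for wi, xi in zip(w, x)))
        g = p - y  # gradient of the log-loss w.r.t. the logit
        w = [wi - lr * g * xi for wi, xi in zip(w, x)]
        b -= lr * g
    return w, b

# Hypothetical toy stream: label is 1 when x0 + x1 > 1.
random.seed(0)
xs = [[random.random(), random.random()] for _ in range(2000)]
stream = [(x, 1 if x[0] + x[1] > 1 else 0) for x in xs]
w, b = sgd_logistic(stream * 3)  # three passes over the stream
```

Each update touches only one example, so memory use is constant in the number of observations and the model can keep learning from a live stream. By contrast, IRLS solves a weighted least-squares problem over the full dataset at every iteration.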