Stochastic gradient descent for a function of multiple variables?

Discussion in 'Mathematics' started by Str91, Aug 1, 2020.

Str91 (Guest)

I understand that SGD is used for large data sets, where each iterative update can be approximated by the gradient of a random sample instead of the sum over all samples. My question: suppose I have a single function of multiple variables, say d of them, i.e. just one sample. Could one still use stochastic gradient descent on just that function? I am asking because one of the homework problems in Gilbert Strang's data science course asks you to compute a single step of gradient descent for a function of two variables, and it explicitly says full gradient descent, not stochastic. I wonder why?
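To make the question concrete, here is a small sketch of what I have in mind (the quadratic f(x, y) = x^2 + 10*y^2 and all names are my own toy example, not from the course): one full gradient step versus a "stochastic" analogue for a single function, where each step updates one randomly chosen coordinate (i.e. random coordinate descent).

```python
import random

# Toy function of two variables: f(x, y) = x**2 + 10*y**2
def grad(x, y):
    """Return the full gradient (df/dx, df/dy)."""
    return (2.0 * x, 20.0 * y)

def full_gd_step(x, y, lr=0.05):
    """One step of full gradient descent: move along the entire gradient."""
    gx, gy = grad(x, y)
    return (x - lr * gx, y - lr * gy)

def random_coordinate_step(x, y, lr=0.05):
    """A 'stochastic' analogue for a single function: update only one
    randomly chosen coordinate per step (random coordinate descent)."""
    gx, gy = grad(x, y)
    if random.random() < 0.5:
        return (x - lr * gx, y)
    return (x, y - lr * gy)

x1, y1 = full_gd_step(1.0, 1.0)
print(x1, y1)  # one full gradient step from (1, 1): (0.9, 0.0)
```

So the question is really whether this kind of per-coordinate random update counts as (and works like) SGD when there is only one sample, or whether SGD only makes sense when the randomness comes from sampling data points.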
