
Gradient Descent Algorithm

2022-07-12 06:50:14 Encyclopedia entry

Overview

  Gradient ascent moves a point along the gradient of a function in steps of size mu, x(k+1) = x(k) + mu*grad f(x(k)); negating the step gives gradient descent. The MATLAB function below estimates the gradient by forward differences and plots the resulting path over a contour plot of the surface:

  function grad_ascent(x,y,z,px,py,N,mu,xstart,ystart)
  % Gradient ascent on the surface z = func(x,y), starting from
  % (xstart, ystart) and taking N steps of size mu along a
  % forward-difference estimate of the gradient.
  h = 1e-6;   % finite-difference step; MATLAB's eps (~2.2e-16) is far
              % too small here, since x + eps == x for most x
  xga = zeros(1,N+1); yga = zeros(1,N+1); zga = zeros(1,N+1);
  xga(1) = xstart;
  yga(1) = ystart;
  zga(1) = func(xga(1),yga(1));
  for i = 1:N
      gradx = ( func(xga(i)+h,yga(i)) - func(xga(i),yga(i)) )/h;
      grady = ( func(xga(i),yga(i)+h) - func(xga(i),yga(i)) )/h;
      xga(i+1) = xga(i) + mu*gradx;   % ascend: step along the gradient
      yga(i+1) = yga(i) + mu*grady;
      zga(i+1) = func(xga(i+1),yga(i+1));
  end
  hold off
  contour(x,y,z,10)          % level curves of the surface
  hold on
  quiver(x,y,px,py)          % gradient field
  plot(xga,yga)              % path taken by the ascent
  S = sprintf('Gradient Ascent: N = %d, Step Size = %f',N,mu);
  title(S)
  xlabel('x axis')
  ylabel('y axis')
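  The function above depends on a user-supplied func(x,y) that the article never defines. As a self-contained sketch of the same loop, here is a Python version; the single-peak surface func, the start point, and the step size h are all assumptions for illustration, not part of the original:

```python
import math

def func(x, y):
    # Hypothetical smooth surface with a single peak at the origin
    # (a stand-in for the article's undefined func).
    return math.exp(-(x**2 + y**2))

def grad_ascent(f, xstart, ystart, N, mu, h=1e-6):
    """Gradient ascent with forward-difference gradients,
    mirroring the MATLAB loop above."""
    x, y = xstart, ystart
    path = [(x, y, f(x, y))]
    for _ in range(N):
        gradx = (f(x + h, y) - f(x, y)) / h
        gray = None  # placeholder removed below
        grady = (f(x, y + h) - f(x, y)) / h
        x += mu * gradx          # ascend: step along the gradient
        y += mu * grady
        path.append((x, y, f(x, y)))
    return path

path = grad_ascent(func, 1.35, -0.45, 100, 0.2)
```

  With this surface and step size the path climbs steadily to the peak; the per-step cost is two extra function evaluations, which is the price of not having an analytic gradient.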

  DEMO

  clear
  print_flag = 1;
  % Sample the surface on a grid.
  width = 1.5;
  xord = -width:.15:width;
  yord = -width:.15:width;
  [x,y] = meshgrid(xord,yord);
  z = func(x,y);
  hold off
  surfl(x,y,z)               % shaded surface plot
  xlabel('x axis')
  ylabel('y axis')
  if print_flag, print
  else, input('Continue?'), end
  % Numerical gradient field; the spacing must match the grid (0.15).
  [px,py] = gradient(z,.15,.15);
  xstart = 0.9*width;
  ystart = -0.3*width;
  % Rerun the ascent with increasing step sizes.
  N = 100;
  mu = 0.02;
  grad_ascent(x,y,z,px,py,N,mu,xstart,ystart)
  if print_flag, print
  else, input('Continue?'), end
  mu = 0.06;
  grad_ascent(x,y,z,px,py,N,mu,xstart,ystart)
  if print_flag, print
  else, input('Continue?'), end
  mu = 0.18;
  grad_ascent(x,y,z,px,py,N,mu,xstart,ystart)
  if print_flag, print
  else, input('Continue?'), end
