pytorch-6 Nonlinear Optimization

1. Visualization: merging x and y with meshgrid

import numpy as np
from matplotlib import pyplot as plt

def fun(x):
    # Himmelblau function
    return (x[0]**2 + x[1] - 11)**2 + (x[0] + x[1]**2 - 7)**2

x = np.arange(-6, 6, 0.1)  # 120 sample points per axis
y = np.arange(-6, 6, 0.1)
X, Y = np.meshgrid(x, y)   # expand the two 1-D axes into 2-D coordinate grids
print(X.shape)
print(Y.shape)
Z = fun([X, Y])            # evaluate the function over the whole grid at once

fig = plt.figure("123")
ax = fig.add_subplot(projection="3d")  # fig.gca(projection=...) is no longer supported in recent Matplotlib
ax.plot_surface(X, Y, Z)
plt.show()
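
For reference, the surface being plotted is the Himmelblau function, a standard test function for optimization algorithms:

f(x, y) = (x^2 + y - 11)^2 + (x + y^2 - 7)^2

It has four global minima where f = 0: (3, 2), (-2.805118, 3.131312), (-3.779310, -3.283186), and (3.584428, -1.848126).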

2. Optimization

import torch

x = torch.tensor([0., 0.], requires_grad=True)  # starting point (0, 0)
optimizer = torch.optim.Adam([x], lr=1e-3)
for step in range(20000):
    pred = fun(x)
    optimizer.zero_grad()  # clear the gradient accumulated in the previous iteration
    pred.backward()        # compute the gradient of pred with respect to x
    optimizer.step()       # apply one Adam update to x
    if step % 2000 == 0:
        # pred was computed before optimizer.step(), so the printed f(x) is the value at the previous x
        print('step{}:x={},f(x)={}'.format(step, x.tolist(), pred.item()))
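
To make the zero_grad/backward/step pattern concrete, here is a minimal sketch of the same minimization with a hand-written update, using plain gradient descent instead of Adam (so the intermediate values will differ from the Adam run above):

import torch

def fun(x):
    return (x[0]**2 + x[1] - 11)**2 + (x[0] + x[1]**2 - 7)**2

x = torch.tensor([0., 0.], requires_grad=True)
lr = 1e-3
for step in range(20000):
    pred = fun(x)
    # same gradient that pred.backward() would accumulate into x.grad
    grad, = torch.autograd.grad(pred, x)
    with torch.no_grad():  # update x in place without recording the update in the graph
        x -= lr * grad
print('x={}, f(x)={}'.format(x.tolist(), fun(x).item()))

Because torch.autograd.grad returns the gradient instead of accumulating it into x.grad, no zero_grad() call is needed in this variant.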

3. Complete code

import torch
import numpy as np
from matplotlib import pyplot as plt

def fun(x):
    # Himmelblau function
    return (x[0]**2 + x[1] - 11)**2 + (x[0] + x[1]**2 - 7)**2

x = np.arange(-6, 6, 0.1)
y = np.arange(-6, 6, 0.1)
X, Y = np.meshgrid(x, y)
print(X.shape)
print(Y.shape)
Z = fun([X, Y])

fig = plt.figure("123")
ax = fig.add_subplot(projection="3d")  # the 3d projection is registered automatically on Matplotlib >= 3.2
ax.plot_surface(X, Y, Z)
plt.show()

x = torch.tensor([0., 0.], requires_grad=True)
optimizer = torch.optim.Adam([x], lr=1e-3)
for step in range(20000):
    pred = fun(x)
    optimizer.zero_grad()  # clear the gradient from the previous iteration
    pred.backward()
    optimizer.step()
    if step % 2000 == 0:
        print('step{}:x={},f(x)={}'.format(step, x.tolist(), pred.item()))

Output:
(120, 120)
(120, 120)
step0:x=[0.0009999999310821295, 0.0009999999310821295],f(x)=170.0
step2000:x=[2.3331806659698486, 1.9540692567825317],f(x)=13.730920791625977
step4000:x=[2.9820079803466797, 2.0270984172821045],f(x)=0.014858869835734367
step6000:x=[2.999983549118042, 2.0000221729278564],f(x)=1.1074007488787174e-08
step8000:x=[2.9999938011169434, 2.0000083446502686],f(x)=1.5572823031106964e-09
step10000:x=[2.999997854232788, 2.000002861022949],f(x)=1.8189894035458565e-10
step12000:x=[2.9999992847442627, 2.0000009536743164],f(x)=1.6370904631912708e-11
step14000:x=[2.999999761581421, 2.000000238418579],f(x)=1.8189894035458565e-12
step16000:x=[3.0, 2.0],f(x)=0.0
step18000:x=[3.0, 2.0],f(x)=0.0
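
Starting from (0, 0), Adam converges to the minimum at (3, 2). Since the Himmelblau function has four global minima, a different starting point can lead the optimizer to a different one. A minimal sketch (the initial points below are arbitrary choices for illustration):

import torch

def fun(x):
    return (x[0]**2 + x[1] - 11)**2 + (x[0] + x[1]**2 - 7)**2

for init in ([0., 0.], [-4., 0.], [-4., -4.], [4., -4.]):  # arbitrary starting points
    x = torch.tensor(init, requires_grad=True)
    optimizer = torch.optim.Adam([x], lr=1e-3)
    for step in range(20000):
        pred = fun(x)
        optimizer.zero_grad()
        pred.backward()
        optimizer.step()
    print('init={} -> x={}, f(x)={}'.format(init, x.tolist(), fun(x).item()))

Each run should settle near one of the four minima listed in section 1, depending on which basin of attraction the initial point falls in.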
