
PyTorch Linear Regression and Logistic Regression: Worked Examples

2020-02-23 00:11:12
Source: reprint

Linear Regression in Practice

Defining a linear regression model in PyTorch generally takes the following steps:

1. Design the network architecture
2. Build the loss function (loss) and optimizer (optimizer)
3. Train (forward pass, backward pass, and parameter update)

# author: yuquanle
# date: 2018.2.5
# Study of LinearRegression using PyTorch
import torch

# training data
x_data = torch.tensor([[1.0], [2.0], [3.0]])
y_data = torch.tensor([[2.0], [4.0], [6.0]])

class Model(torch.nn.Module):
    def __init__(self):
        super(Model, self).__init__()
        self.linear = torch.nn.Linear(1, 1)  # one input feature, one output

    def forward(self, x):
        y_pred = self.linear(x)
        return y_pred

# our model
model = Model()

# loss function: summed squared error (the deprecated size_average=False
# is now spelled reduction='sum')
criterion = torch.nn.MSELoss(reduction='sum')
optimizer = torch.optim.SGD(model.parameters(), lr=0.01)  # optimizer

# Training loop: forward, loss, backward, step
for epoch in range(50):
    y_pred = model(x_data)            # forward pass
    loss = criterion(y_pred, y_data)  # compute loss
    print(epoch, loss.item())         # loss.item() replaces the old loss.data[0]
    optimizer.zero_grad()             # zero gradients
    loss.backward()                   # backward pass
    optimizer.step()                  # update weights

# After training
hour_var = torch.tensor([[4.0]])
print("predict (after training)", 4, model(hour_var).item())

Printed results when the loop is run for 10 iterations:

0 123.87958526611328
1 55.19491195678711
2 24.61777114868164
3 11.005026817321777
4 4.944361686706543
5 2.2456750869750977
6 1.0436556339263916
7 0.5079189538955688
8 0.2688019871711731
9 0.16174012422561646
predict (after training) 4 7.487752914428711

The loss is still falling, and the prediction for input 4 is not yet very accurate.

When the iteration count is set to 50:

0 35.38422393798828
5 0.6207122802734375
10 0.012768605723977089
15 0.0020055510103702545
20 0.0016929294215515256
25 0.0015717096393927932
30 0.0014619173016399145
35 0.0013598509831354022
40 0.0012649153359234333
45 0.00117658288218081
50 0.001094428705982864
predict (after training) 4 8.038028717041016

At this point the model fits the data quite well.
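A quick way to confirm the fit (a sketch not in the original article, using the same data and loss) is to train a bare torch.nn.Linear(1, 1) and inspect the learned weight and bias directly; for y = 2x they should approach 2 and 0:

```python
import torch

torch.manual_seed(0)  # fix the random initialization so the sketch is repeatable

x_data = torch.tensor([[1.0], [2.0], [3.0]])
y_data = torch.tensor([[2.0], [4.0], [6.0]])

model = torch.nn.Linear(1, 1)
criterion = torch.nn.MSELoss(reduction='sum')
optimizer = torch.optim.SGD(model.parameters(), lr=0.01)

# Train longer than the article's 50 epochs so the parameters converge tightly
for epoch in range(2000):
    loss = criterion(model(x_data), y_data)
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()

w = model.weight.item()
b = model.bias.item()
print("weight:", round(w, 4), "bias:", round(b, 4))
```

With enough epochs the weight lands very close to 2 and the bias very close to 0, which is why the prediction for input 4 approaches 8.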

Running it again:

0 159.48605346679688
5 2.827991485595703
10 0.08624256402254105
15 0.03573693335056305
20 0.032463930547237396
25 0.030183646827936172
30 0.02807590737938881
35 0.026115568354725838
40 0.02429218217730522
45 0.022596003487706184
50 0.0210183784365654
predict (after training) 4 7.833342552185059

Note that with the same 50 iterations, the result for input 4 differs between runs. The reason lies in how the model is defined: torch.nn.Linear(1, 1) only requires the input and output dimensions, and its weight and bias are initialized randomly (by default PyTorch draws them from a uniform distribution). Each run therefore starts from different parameters, the loss descends along a different path, the updated parameters end up different, and so the final prediction differs.
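If you need runs to be repeatable, you can fix the random seed before constructing the model. A minimal sketch (not part of the original article) showing that the same seed reproduces the same initial parameters:

```python
import torch

def init_linear(seed):
    # Seeding the global RNG right before construction makes
    # nn.Linear's random weight/bias initialization deterministic.
    torch.manual_seed(seed)
    layer = torch.nn.Linear(1, 1)
    return layer.weight.item(), layer.bias.item()

params_a = init_linear(42)
params_b = init_linear(42)
params_c = init_linear(7)

print(params_a == params_b)  # the same seed gives identical initial parameters
print(params_a == params_c)  # a different seed gives a different starting point
```

Seeding once before training makes the whole run deterministic on CPU, so the loss curve and the final prediction for input 4 come out the same every time.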
