
2021-11-12


Because stochastic gradient descent uses only a single sample per iteration, training is very fast.

import numpy as np
import matplotlib.pyplot as plt  # imported in the original listing, but not used below

# Housing data: each row is [area, number of bedrooms]; Y is the price
X = np.array([[2104, 3], [1600, 3], [2400, 3], [1416, 2], [3000, 4]])
Y = np.array([400, 330, 369, 232, 540])

# Random initial parameters
theta0 = np.random.random()
theta1 = np.random.random()
theta2 = np.random.random()

# The original values were garbled by extraction; these are plausible choices.
# The learning rate must be very small because the features are unscaled.
epochs = 1000
alpha = 0.0000001

# Mean squared error cost over all m samples
def cost(X, Y, theta0, theta1, theta2):
    loss = 0
    m = len(Y)
    for i in range(m):
        loss += (theta0 + theta1 * X[i, 0] + theta2 * X[i, 1] - Y[i]) ** 2
    return loss / (2 * m)

# Gradient descent: accumulate the gradient over all samples, then update
def grad_des(X, Y, theta0, theta1, theta2, alpha, epochs):
    m = len(Y)
    for z in range(epochs):
        theta0_grad = 0
        theta1_grad = 0
        theta2_grad = 0
        for i in range(m):
            error = theta0 + theta1 * X[i, 0] + theta2 * X[i, 1] - Y[i]
            theta0_grad += error
            theta1_grad += error * X[i, 0]
            theta2_grad += error * X[i, 1]
        theta0 -= alpha * theta0_grad / m
        theta1 -= alpha * theta1_grad / m
        theta2 -= alpha * theta2_grad / m
    return theta0, theta1, theta2

theta0, theta1, theta2 = grad_des(X, Y, theta0, theta1, theta2, alpha, epochs)
print(theta0, theta1, theta2)
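Note that the listing above accumulates the gradient over all five samples before each parameter update, which is batch gradient descent. A minimal sketch of the stochastic variant described in the text, updating after every single sample, could look like the following (the `sgd` function name, the per-sample shuffling, and the learning-rate/epoch values are my additions, not from the original article):

```python
import numpy as np

X = np.array([[2104, 3], [1600, 3], [2400, 3], [1416, 2], [3000, 4]], dtype=float)
Y = np.array([400, 330, 369, 232, 540], dtype=float)

def sgd(X, Y, alpha=1e-7, epochs=1000, seed=0):
    rng = np.random.default_rng(seed)
    theta0 = theta1 = theta2 = 0.0
    m = len(Y)
    for _ in range(epochs):
        # visit the samples in a fresh random order each epoch
        for i in rng.permutation(m):
            error = theta0 + theta1 * X[i, 0] + theta2 * X[i, 1] - Y[i]
            # update the parameters immediately from this one sample
            theta0 -= alpha * error
            theta1 -= alpha * error * X[i, 0]
            theta2 -= alpha * error * X[i, 1]
    return theta0, theta1, theta2

theta0, theta1, theta2 = sgd(X, Y)
print(theta0, theta1, theta2)
```

Because each update uses a single noisy gradient, the parameters fluctuate around the minimum instead of settling exactly on it, but each epoch costs the same number of arithmetic operations while making m updates instead of one.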

Reprinted from www.mshxw.com. Original article: https://www.mshxw.com/it/488678.html