
Machine Learning Basics - Multivariate Linear Regression - 02

發(fā)布時(shí)間:2024/9/15 编程问答 36 豆豆

Matrix Operations
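As a brief NumPy sketch of the matrix operations used throughout this post (product, transpose, inverse); the matrices A and B below are made-up examples:

```python
import numpy as np

A = np.array([[1.0, 2.0], [3.0, 4.0]])
B = np.array([[5.0, 6.0], [7.0, 8.0]])

C = A @ B                # matrix product (rows of A dotted with columns of B)
At = A.T                 # transpose
Ainv = np.linalg.inv(A)  # inverse; only defined for non-singular square matrices

print(C)                 # [[19. 22.] [43. 50.]]
print(np.allclose(A @ Ainv, np.eye(2)))  # True
```

Note that `*` on NumPy arrays is element-wise multiplication; `@` (or `np.dot`) is the matrix product.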







Multivariate Linear Regression
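The model here is ŷ = θ0 + θ1·x1 + θ2·x2. Besides gradient descent, it can be fit in closed form with the normal equation θ = (XᵀX)⁻¹Xᵀy. A minimal sketch on synthetic data (the data below is made up purely for illustration and generated noise-free, so the exact coefficients are recovered):

```python
import numpy as np

# Synthetic data drawn from y = 2 + 3*x1 - 1*x2 (no noise),
# so the normal equation should recover these coefficients exactly.
rng = np.random.default_rng(0)
X = rng.random((20, 2))
y = 2 + 3 * X[:, 0] - 1 * X[:, 1]

# Prepend a column of ones so theta[0] acts as the intercept theta0
Xb = np.hstack([np.ones((len(X), 1)), X])

# Normal equation: theta = (X^T X)^(-1) X^T y
theta = np.linalg.inv(Xb.T @ Xb) @ Xb.T @ y
print(theta)  # approximately [2, 3, -1]
```

The normal equation is exact but costs a matrix inversion; gradient descent, covered next, scales better to many features.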






Gradient Descent - Multivariate Linear Regression

import numpy as np
from numpy import genfromtxt
import matplotlib.pyplot as plt
from mpl_toolkits.mplot3d import Axes3D

# Read in the data
data = genfromtxt(r"Delivery.csv", delimiter=',')
print(data)

# Split into features and target
x_data = data[:, :-1]
y_data = data[:, -1]
print(x_data)
print(y_data)

# Learning rate
lr = 0.0001
# Parameters
theta0 = 0
theta1 = 0
theta2 = 0
# Maximum number of iterations
epochs = 1000

# Mean squared error cost
def compute_error(theta0, theta1, theta2, x_data, y_data):
    totalError = 0
    for i in range(0, len(x_data)):
        totalError += (y_data[i] - (theta1 * x_data[i, 0] + theta2 * x_data[i, 1] + theta0)) ** 2
    return totalError / float(len(x_data))

def gradient_descent_runner(x_data, y_data, theta0, theta1, theta2, lr, epochs):
    # Total number of data points
    m = float(len(x_data))
    # Iterate epochs times
    for i in range(epochs):
        theta0_grad = 0
        theta1_grad = 0
        theta2_grad = 0
        # Sum the gradients over all points, averaging by 1/m
        for j in range(0, len(x_data)):
            theta0_grad += (1 / m) * ((theta1 * x_data[j, 0] + theta2 * x_data[j, 1] + theta0) - y_data[j])
            theta1_grad += (1 / m) * x_data[j, 0] * ((theta1 * x_data[j, 0] + theta2 * x_data[j, 1] + theta0) - y_data[j])
            theta2_grad += (1 / m) * x_data[j, 1] * ((theta1 * x_data[j, 0] + theta2 * x_data[j, 1] + theta0) - y_data[j])
        # Update the parameters
        theta0 = theta0 - (lr * theta0_grad)
        theta1 = theta1 - (lr * theta1_grad)
        theta2 = theta2 - (lr * theta2_grad)
    return theta0, theta1, theta2

print("Starting theta0 = {0}, theta1 = {1}, theta2 = {2}, error = {3}".format(theta0, theta1, theta2, compute_error(theta0, theta1, theta2, x_data, y_data)))
print("Running...")
theta0, theta1, theta2 = gradient_descent_runner(x_data, y_data, theta0, theta1, theta2, lr, epochs)
print("After {0} iterations theta0 = {1}, theta1 = {2}, theta2 = {3}, error = {4}".format(epochs, theta0, theta1, theta2, compute_error(theta0, theta1, theta2, x_data, y_data)))
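The three per-parameter loops compute one gradient component each; they can be collapsed into the single vectorized update grad = Xᵀ(Xθ − y)/m. A sketch on a tiny hand-made dataset (Delivery.csv itself is not reproduced here, so the data below is invented; the learning rate and epoch count are also chosen for this toy data, not the original):

```python
import numpy as np

def gradient_descent_vec(X, y, lr=0.05, epochs=5000):
    """Vectorized batch gradient descent.

    X is expected to carry a leading column of ones, so theta[0]
    plays the role of the intercept (theta0 above).
    """
    m, n = X.shape
    theta = np.zeros(n)
    for _ in range(epochs):
        # Same gradient as the three explicit loops, in one expression
        grad = X.T @ (X @ theta - y) / m
        theta -= lr * grad
    return theta

# Tiny dataset sampled from y = 1 + 1*x, written with a bias column
X = np.array([[1.0, 1.0], [1.0, 2.0], [1.0, 3.0]])
y = np.array([2.0, 3.0, 4.0])
theta = gradient_descent_vec(X, y)
print(theta)  # close to [1.0, 1.0]
```

The vectorized form also generalizes to any number of features without adding more theta variables by hand.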

ax = plt.figure().add_subplot(111, projection='3d')
# Plot the data points as red circles
ax.scatter(x_data[:, 0], x_data[:, 1], y_data, c='r', marker='o', s=100)
x0 = x_data[:, 0]
x1 = x_data[:, 1]
# Build the mesh grid
x0, x1 = np.meshgrid(x0, x1)
z = theta0 + x0 * theta1 + x1 * theta2
# Plot the fitted plane in 3D
ax.plot_surface(x0, x1, z)
# Label the axes
ax.set_xlabel('Miles')
ax.set_ylabel('Num of Deliveries')
ax.set_zlabel('Time')
# Show the figure
plt.show()

sklearn - Multivariate Linear Regression

import numpy as np
from numpy import genfromtxt
from sklearn import linear_model
import matplotlib.pyplot as plt
from mpl_toolkits.mplot3d import Axes3D

# Read in the data
data = genfromtxt(r"Delivery.csv", delimiter=',')
print(data)

# Split into features and target
x_data = data[:, :-1]
y_data = data[:, -1]
print(x_data)
print(y_data)

# Create and fit the model
model = linear_model.LinearRegression()
model.fit(x_data, y_data)
# Coefficients
print("coefficients:", model.coef_)
# Intercept
print("intercept:", model.intercept_)
# Test prediction
x_test = [[102, 4]]
predict = model.predict(x_test)
print("predict:", predict)
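Under the hood, LinearRegression solves the same least-squares problem. A NumPy-only cross-check using np.linalg.lstsq, on hypothetical stand-in data (Delivery.csv is not reproduced here, so the [miles, deliveries] → time rows below are invented for illustration):

```python
import numpy as np

# Hypothetical stand-in for Delivery.csv: [miles, deliveries] -> time
X = np.array([[100.0, 4.0], [50.0, 3.0], [100.0, 4.0],
              [100.0, 2.0], [50.0, 2.0], [80.0, 2.0]])
y = np.array([9.3, 4.8, 8.9, 6.5, 4.2, 6.2])

# Least-squares fit via lstsq on X with a bias column prepended
Xb = np.hstack([np.ones((len(X), 1)), X])
coef, *_ = np.linalg.lstsq(Xb, y, rcond=None)
intercept, w = coef[0], coef[1:]

# Manual prediction for x_test = [102, 4], mirroring model.predict
pred = intercept + np.array([102.0, 4.0]) @ w
print(intercept, w, pred)
```

On the real Delivery.csv, `intercept` and `w` should match `model.intercept_` and `model.coef_` from the sklearn fit above.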

ax = plt.figure().add_subplot(111, projection='3d')
# Plot the data points as red circles
ax.scatter(x_data[:, 0], x_data[:, 1], y_data, c='r', marker='o', s=100)
x0 = x_data[:, 0]
x1 = x_data[:, 1]
# Build the mesh grid
x0, x1 = np.meshgrid(x0, x1)
z = model.intercept_ + x0 * model.coef_[0] + x1 * model.coef_[1]
# Plot the fitted plane in 3D
ax.plot_surface(x0, x1, z)
# Label the axes
ax.set_xlabel('Miles')
ax.set_ylabel('Num of Deliveries')
ax.set_zlabel('Time')
# Show the figure
plt.show()

Summary

That is the full content of Machine Learning Basics - Multivariate Linear Regression - 02; hopefully it helps you solve the problems you've run into.
