1.9 Program Example: Locally Weighted Linear Regression (Machine Learning Notes, Stanford, Andrew Ng)
Program Example: Locally Weighted Linear Regression

We now add a `JLwr()` method to the regression module to compute the prediction cost, and an `lwr()` method to perform locally weighted linear regression:
```python
# coding: utf-8
# linear_regression/regression.py
import numpy as np

# ... (earlier definitions, such as the exeTime decorator, are omitted)

def JLwr(theta, X, y, x, c):
    """Cost function for locally weighted linear regression.

    Args:
        theta: coefficient matrix
        X: sample matrix
        y: label matrix
        x: input to be predicted
        c: tau (bandwidth)
    Returns:
        prediction cost
    """
    m, n = X.shape
    summerize = 0
    for i in range(m):
        diff = (X[i] - x) * (X[i] - x).T
        w = np.exp(-diff / (2 * c * c))
        predictDiff = np.power(y[i] - X[i] * theta, 2)
        summerize = summerize + w * predictDiff
    return summerize

@exeTime
def lwr(rate, maxLoop, epsilon, X, y, x, c=1):
    """Locally weighted linear regression.

    Args:
        rate: learning rate
        maxLoop: maximum number of iterations
        epsilon: convergence precision
        X: input samples
        y: label vector
        x: vector to be predicted
        c: tau (bandwidth)
    """
    m, n = X.shape
    # Initialize theta
    theta = np.zeros((n, 1))
    count = 0
    converged = False
    error = float('inf')
    errors = []
    thetas = {}
    for j in range(n):
        thetas[j] = [theta[j, 0]]
    # Run batch gradient descent
    while count <= maxLoop:
        if converged:
            break
        count = count + 1
        for j in range(n):
            deriv = (y - X * theta).T * X[:, j] / m
            theta[j, 0] = theta[j, 0] + rate * deriv
            thetas[j].append(theta[j, 0])
        error = JLwr(theta, X, y, x, c)
        errors.append(error[0, 0])
        # Stop once the cost has converged
        if error < epsilon:
            converged = True
    return theta, errors, thetas

# ...
```
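For reference, the quantity that `JLwr` evaluates is the standard weighted squared-error cost of locally weighted regression around the query point x; the parameter `c` in the code plays the role of the bandwidth τ (a sketch of the usual formulation, matching the loop above term by term):

```latex
J(\theta) = \sum_{i=1}^{m} w^{(i)} \left( y^{(i)} - \theta^{T} x^{(i)} \right)^{2},
\qquad
w^{(i)} = \exp\!\left( -\frac{\lVert x^{(i)} - x \rVert^{2}}{2\tau^{2}} \right)
```

Points close to the query x receive weight near 1, while distant points are effectively ignored, so the fit is local to x.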
Test program:

```python
# coding: utf-8
# linear_regression/test_lwr.py
import regression
import matplotlib.pyplot as plt
import matplotlib.ticker as mtick
import numpy as np

if __name__ == "__main__":
    srcX, y = regression.loadDataSet('data/lwr.txt')
    m, n = srcX.shape
    srcX = np.concatenate((srcX[:, 0], np.power(srcX[:, 0], 2)), axis=1)
    # Feature scaling
    X = regression.standardize(srcX.copy())
    X = np.concatenate((np.ones((m, 1)), X), axis=1)

    rate = 0.1
    maxLoop = 1000
    epsilon = 0.01

    predicateX = regression.standardize(np.matrix([[8, 64]]))
    predicateX = np.concatenate((np.ones((1, 1)), predicateX), axis=1)

    result, t = regression.lwr(rate, maxLoop, epsilon, X, y, predicateX, 1)
    theta, errors, thetas = result
    result2, t = regression.lwr(rate, maxLoop, epsilon, X, y, predicateX, 0.1)
    theta2, errors2, thetas2 = result2

    # Plot the training points
    fittingFig = plt.figure()
    title = 'polynomial with bgd: rate=%.2f, maxLoop=%d, epsilon=%.3f' % (
        rate, maxLoop, epsilon)
    ax = fittingFig.add_subplot(111, title=title)
    trainingSet = ax.scatter(srcX[:, 0].flatten().A[0], y[:, 0].flatten().A[0])

    print(theta)
    print(theta2)

    # Plot the fitted curves
    xx = np.linspace(1, 7, 50)
    xx2 = np.power(xx, 2)
    yHat1 = []
    yHat2 = []
    for i in range(50):
        normalizedSize = (xx[i] - xx.mean()) / xx.std(0)
        normalizedSize2 = (xx2[i] - xx2.mean()) / xx2.std(0)
        x = np.matrix([[1, normalizedSize, normalizedSize2]])
        yHat1.append(regression.h(theta, x.T))
        yHat2.append(regression.h(theta2, x.T))
    fittingLine1, = ax.plot(xx, yHat1, color='g')
    fittingLine2, = ax.plot(xx, yHat2, color='r')

    ax.set_xlabel('temperature')
    ax.set_ylabel('yield')
    plt.legend([trainingSet, fittingLine1, fittingLine2],
               ['Training Set', r'LWR with $\tau$=1', r'LWR with $\tau$=0.1'])
    plt.show()

    # Plot the error curve
    errorsFig = plt.figure()
    ax = errorsFig.add_subplot(111)
    ax.yaxis.set_major_formatter(mtick.FormatStrFormatter('%.2e'))
    ax.plot(range(len(errors)), errors)
    ax.set_xlabel('Number of iterations')
    ax.set_ylabel('Cost J')
    plt.show()
```

In the test program, we run LWR with τ set to 0.1 and to 1, obtaining two different fitted curves.
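To see why the two values of τ behave so differently, it helps to look at the Gaussian weights themselves. The following standalone sketch (the helper `lwr_weights` is hypothetical, not part of the `regression` module) computes the weights a query point assigns to its neighbors under a wide and a narrow bandwidth:

```python
import numpy as np

def lwr_weights(X, x, tau):
    """Gaussian LWR weights: samples near the query x get weight
    close to 1; distant samples get weight close to 0."""
    diffs = np.sum((X - x) ** 2, axis=1)  # squared distances to the query
    return np.exp(-diffs / (2 * tau * tau))

# Three 1-D samples; query the weights around x = 1.0
X = np.array([[1.0], [2.0], [5.0]])
wide = lwr_weights(X, np.array([1.0]), tau=1.0)    # smooth, nearly global fit
narrow = lwr_weights(X, np.array([1.0]), tau=0.1)  # sharply local fit

# The sample at the query always has weight 1; with tau = 0.1 the
# other samples are effectively ignored, which is why a small tau
# produces a wigglier, more local curve.
print(wide)
print(narrow)
```

With τ = 1 the neighbor at distance 1 still carries weight exp(-0.5) ≈ 0.61, so many points influence the fit; with τ = 0.1 that same neighbor's weight is exp(-50), essentially zero, so only the immediate neighborhood matters.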