Support Vector Machines (SVM): the Sequential Minimal Optimization (SMO) Algorithm
The support vector machine (Support Vector Machine) originates in the 1964 work of V. N. Vapnik and A. Y. Chervonenkis; its modern soft-margin form is due to C. Cortes and V. Vapnik. Sequential minimal optimization (SMO) is an algorithm for solving the quadratic optimization problem that arises when training an SVM, proposed by John C. Platt in 1998.
Detailed derivations of the SVM are available in the "watermelon book" (Zhou Zhihua, Machine Learning) and on many websites, so they are not repeated here. This article implements the SVM and the SMO algorithm following John C. Platt's paper "Sequential Minimal Optimization: A Fast Algorithm for Training Support Vector Machines".
The flow of the algorithm: an outer loop scans for a multiplier alpha2 that violates the KKT conditions; an inner heuristic picks a second multiplier alpha1; the pair is then optimized analytically in takeStep; this repeats until no multiplier changes during a full scan.
```python
import numpy as np
from sklearn import datasets
import matplotlib.pyplot as plt
```
First define a parameter object holding everything the algorithm needs: the data samples, the labels, the bias b, the Lagrange multipliers alpha, the penalty parameter C, and so on.
```python
class Par:
    def __init__(self, n, D, C, eps, tol):
        # Generate a two-class toy dataset
        self.X = datasets.make_blobs(n_samples=n, n_features=D, centers=2,
                                     cluster_std=1.0, shuffle=True,
                                     random_state=None)
        self.point = self.X[0]                # samples, shape (n, D)
        self.target = self.X[1]               # labels in {0, 1}
        self.target[np.nonzero(self.target == 0)[0]] = -1  # relabel to {-1, +1}
        self.w = np.zeros(D)                  # weight vector (linear SVM only)
        self.b = 0                            # threshold
        # Error cache E_i = f(x_i) - y_i; initially f == 0, so E = -y.
        # Cast to float so later float assignments are not truncated.
        self.E = -self.target.astype(float)
        self.alpha = np.zeros(n)              # Lagrange multipliers
        self.n = n
        self.C = C                            # penalty parameter
        self.eps = eps                        # stopping precision
        self.tol = tol                        # KKT tolerance
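One detail worth noting in Par.__init__: make_blobs produces labels in {0, 1}, while the SVM derivation assumes labels in {-1, +1}. The relabeling statement can be checked in isolation (the array here is a made-up example):

```python
import numpy as np

# make_blobs yields labels in {0, 1}; the SVM formulation needs {-1, +1}
y = np.array([0, 1, 0, 1])
y[np.nonzero(y == 0)[0]] = -1   # same statement as in Par.__init__
print(y.tolist())               # -> [-1, 1, -1, 1]
```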
Next define the kernel function and the decision function f(x) = sum_i alpha_i * y_i * K(x_i, x) - b.
```python
def kernel(x, y):
    # Linear kernel: K(x, y) = x . y
    return np.dot(x, y.T)

def f(x):
    # Decision function f(x) = sum_i alpha_i * y_i * K(x_i, x) - b
    s = 0
    for i in range(n):
        s += P.alpha[i] * P.target[i] * kernel(P.point[i], x)
    return s - P.b
```
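The dot-product kernel above makes this a linear SVM. For data that is not linearly separable it could be swapped for, e.g., a Gaussian (RBF) kernel; a minimal sketch (the name `rbf_kernel` and the bandwidth `gamma` are my own choices, not from Platt's paper):

```python
import numpy as np

def rbf_kernel(x, y, gamma=0.5):
    # Gaussian (RBF) kernel: K(x, y) = exp(-gamma * ||x - y||^2).
    # A drop-in replacement for kernel(x, y) above.
    diff = np.asarray(x, dtype=float) - np.asarray(y, dtype=float)
    return np.exp(-gamma * np.dot(diff, diff))

print(rbf_kernel([0, 0], [0, 0]))   # identical points -> 1.0
print(rbf_kernel([0, 0], [3, 4]))   # distant points  -> near 0
```

Note that with a nonlinear kernel the explicit weight-vector update in takeStep below no longer applies, and the decision boundary can no longer be drawn from P.w.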
The update of a selected pair of multipliers:
```python
def takeStep(i1, i2):
    if i1 == i2:
        return 0
    alph2 = P.alpha[i2]
    alph1 = P.alpha[i1]
    y1 = P.target[i1]
    y2 = P.target[i2]
    s = y1 * y2
    # Compute L, H via equations (13) and (14)
    if y1 != y2:
        L = max(0, alph2 - alph1)
        H = min(P.C, P.C + alph2 - alph1)
    else:
        L = max(0, alph2 + alph1 - P.C)
        H = min(P.C, alph2 + alph1)
    if L == H:
        return 0
    k11 = kernel(P.point[i1], P.point[i1])
    k12 = kernel(P.point[i1], P.point[i2])
    k22 = kernel(P.point[i2], P.point[i2])
    eta = k11 + k22 - 2 * k12
    if eta > 0:
        # Unconstrained optimum (equation (16)), clipped into [L, H]
        a2 = alph2 + y2 * (P.E[i1] - P.E[i2]) / eta
        if a2 < L:
            a2 = L
        elif a2 > H:
            a2 = H
    else:
        # Degenerate case: evaluate the objective at both ends (equation (19))
        f1 = y1 * (P.E[i1] + P.b) - alph1 * k11 - s * alph2 * k12
        f2 = y2 * (P.E[i2] + P.b) - s * alph1 * k12 - alph2 * k22
        L1 = alph1 + s * (alph2 - L)
        H1 = alph1 + s * (alph2 - H)
        Lobj = L1 * f1 + L * f2 + 0.5 * L1 ** 2 * k11 + 0.5 * L ** 2 * k22 + s * L * L1 * k12
        Hobj = H1 * f1 + H * f2 + 0.5 * H1 ** 2 * k11 + 0.5 * H ** 2 * k22 + s * H * H1 * k12
        if Lobj < Hobj - P.eps:
            a2 = L
        elif Lobj > Hobj + P.eps:
            a2 = H
        else:
            a2 = alph2
    if abs(a2 - alph2) < P.eps * (a2 + alph2 + P.eps):
        return 0
    a1 = alph1 + s * (alph2 - a2)
    # Update threshold to reflect change in Lagrange multipliers
    b1 = P.E[i1] + y1 * (a1 - alph1) * k11 + y2 * (a2 - alph2) * k12 + P.b
    b2 = P.E[i2] + y1 * (a1 - alph1) * k12 + y2 * (a2 - alph2) * k22 + P.b
    if 0 < a1 < P.C:
        P.b = b1
    elif 0 < a2 < P.C:
        P.b = b2
    else:
        P.b = (b1 + b2) / 2
    # Update weight vector to reflect change in a1 & a2, if SVM is linear
    P.w = P.w + y1 * (a1 - alph1) * P.point[i1] + y2 * (a2 - alph2) * P.point[i2]
    # Store a1 and a2 in the alpha array
    P.alpha[i1] = a1
    P.alpha[i2] = a2
    # Update error cache using new Lagrange multipliers
    P.E[i1] = f(P.point[i1]) - P.target[i1]
    P.E[i2] = f(P.point[i2]) - P.target[i2]
    return 1
```
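The box-constraint handling inside takeStep (Platt's equations (13)-(15)) can be isolated and checked on its own. A minimal sketch, with helper names of my own choosing:

```python
def compute_bounds(alph1, alph2, y1, y2, C):
    # Equations (13)/(14): the feasible interval [L, H] for the new alpha2,
    # so that both multipliers stay in [0, C] and the linear constraint holds.
    if y1 != y2:
        return max(0.0, alph2 - alph1), min(C, C + alph2 - alph1)
    else:
        return max(0.0, alph2 + alph1 - C), min(C, alph2 + alph1)

def clip(a2, L, H):
    # Equation (15): clip the unconstrained optimum into [L, H]
    return min(max(a2, L), H)

print(compute_bounds(1.0, 2.0, 1, -1, 10))  # y1 != y2 -> (1.0, 10.0)
print(compute_bounds(1.0, 2.0, 1, 1, 10))   # y1 == y2 -> (0.0, 3.0)
print(clip(12.0, 1.0, 10.0))                # -> 10.0
```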
The inner loop chooses the second multiplier for a given i2:
```python
def examineExample(i2):
    global valid
    alph2 = P.alpha[i2]
    y2 = P.target[i2]
    r2 = P.E[i2] * y2
    # Proceed only if this example violates the KKT conditions
    if (r2 < -P.tol and alph2 < P.C) or (r2 > P.tol and alph2 > 0):
        valid = np.where((P.alpha != 0) & (P.alpha != P.C))[0]
        Long = len(valid)
        if Long > 1:
            # i1 = result of second choice heuristic (section 2.2):
            # pick the example maximizing |E1 - E2|
            best = -1
            for k in valid:
                deltaE = abs(P.E[i2] - P.E[k])
                if deltaE > best:
                    best = deltaE
                    i1 = k
            if takeStep(i1, i2):
                return 1
        # Loop over all non-zero and non-C alpha, starting at a random point
        if Long > 0:
            random_index = np.random.randint(0, Long)
            for i1 in np.hstack((valid[random_index:Long], valid[0:random_index])):
                if takeStep(i1, i2):
                    return 1
        # Loop over all possible i1, starting at a random point
        random_index = np.random.randint(0, n)
        for i1 in np.hstack((np.arange(random_index, n), np.arange(0, random_index))):
            if takeStep(i1, i2):
                return 1
    return 0
```
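The second-choice heuristic above simply maximizes |E[i2] - E[i1]| over the non-bound examples, which can also be written with a single argmax. A small sketch with a made-up error cache:

```python
import numpy as np

E = np.array([0.2, -1.5, 0.4, 3.0])   # hypothetical error cache
valid = np.array([0, 2, 3])           # hypothetical indices with 0 < alpha < C
i2 = 1

# Second-choice heuristic: pick i1 maximizing |E[i2] - E[i1]|
i1 = valid[np.argmax(np.abs(E[i2] - E[valid]))]
print(i1)   # |−1.5 − 3.0| = 4.5 is the largest gap -> 3
```

The loop in examineExample does the same thing; the vectorized form is just more idiomatic NumPy.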
The outer loop chooses the first multiplier:
```python
def SMO():
    global valid
    numChanged = 0
    examineAll = 1
    while numChanged > 0 or examineAll:
        numChanged = 0
        if examineAll:
            for i in range(n):
                numChanged += examineExample(i)
        else:
            # Loop over examples where alpha is not 0 & not C
            for i in valid:
                numChanged += examineExample(i)
        if examineAll == 1:
            examineAll = 0
        elif numChanged == 0:
            examineAll = 1
```
The main entry point:
```python
if __name__ == '__main__':
    n = 100       # number of samples
    C = 10        # penalty parameter
    eps = 0.001   # stopping precision
    tol = 0.001   # KKT tolerance
    D = 2         # sample dimension
    P = Par(n, D, C, eps, tol)
    SMO()
    # Plot the samples and the separating line w . x - b = 0
    plt.scatter(P.point[:, 0], P.point[:, 1], c=P.target)
    x = np.arange(-10, 10, 0.1)
    y = (P.b - P.w[0] * x) / P.w[1]
    plt.plot(x, y)
    plt.show()
    # Count training errors: a point is misclassified when
    # sign(w . x - b) disagrees with its label
    Y = kernel(P.point, P.w) - P.b
    count = 0
    for i in range(n):
        if Y[i] * P.target[i] < 0:
            count += 1
    print('Error Point num:', count)
```
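The error count printed above can be sanity-checked against scikit-learn's reference SVM on the same kind of data. A sketch, not part of the original program; `random_state` is fixed here only so the comparison is reproducible:

```python
import numpy as np
from sklearn import datasets
from sklearn.svm import SVC

# Same kind of data as Par generates, with the same {-1, +1} relabeling
X, y = datasets.make_blobs(n_samples=100, n_features=2, centers=2,
                           cluster_std=1.0, random_state=0)
y = np.where(y == 0, -1, 1)

# Reference linear SVM with the same penalty parameter C
clf = SVC(kernel='linear', C=10).fit(X, y)
errors = int((clf.predict(X) != y).sum())
print('sklearn training errors:', errors)
```

On well-separated blobs both implementations should misclassify few, if any, training points; a large gap between the two counts would indicate a bug in the SMO code.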
The result of a single run:
總結(jié)