
[Practice] Softmax Regression

Published: 2025/3/15, by 豆豆
This article, collected and edited by 生活随笔, walks through a hands-on implementation of Softmax Regression and is shared here for reference.

The code and data have been uploaded and can be downloaded from: https://download.csdn.net/download/shenziheng1/10721992

1. Training Code

import numpy as np


def load_data(inputfile):
    """Load tab-separated training data; the last column is the integer class label."""
    f = open(inputfile)
    feature_data = []
    label_data = []
    for line in f.readlines():
        feature_tmp = [1]  # bias/offset term
        lines = line.strip().split("\t")
        for i in range(len(lines) - 1):
            feature_tmp.append(float(lines[i]))
        label_data.append(int(lines[-1]))
        feature_data.append(feature_tmp)
    f.close()
    return np.mat(feature_data), np.mat(label_data).T, len(set(label_data))


def gradient_ascent(feature_data, label_data, k, maxCycle, alpha):
    """Fit Softmax Regression weights by batch gradient ascent on the log-likelihood."""
    m, n = np.shape(feature_data)
    weights = np.mat(np.ones((n, k)))
    i = 0
    while i <= maxCycle:
        err = np.exp(feature_data * weights)
        if i % 100 == 0:
            print("\t-------iter: ", i, ", cost: ", cost(err, label_data))
        rowsum = -err.sum(axis=1)
        rowsum = rowsum.repeat(k, axis=1)
        err = err / rowsum  # err is now the negative softmax probability matrix
        for x in range(m):
            err[x, label_data[x, 0]] += 1  # indicator minus probability at the true class
        weights = weights + (alpha / m) * feature_data.T * err
        i += 1
    return weights


def cost(err, label_data):
    """Average negative log-likelihood; err holds the unnormalized exp scores."""
    m = np.shape(err)[0]
    sum_cost = 0.0
    for i in range(m):
        if err[i, label_data[i, 0]] / np.sum(err[i, :]) > 0:
            sum_cost -= np.log(err[i, label_data[i, 0]] / np.sum(err[i, :]))
        else:
            sum_cost -= 0.0
    return sum_cost / m


def save_model(file_name, weights):
    """Write the weight matrix to file, one tab-separated row per line."""
    f_w = open(file_name, "w")
    m, n = np.shape(weights)
    for i in range(m):
        w_tmp = []
        for j in range(n):
            w_tmp.append(str(weights[i, j]))
        f_w.write("\t".join(w_tmp) + "\n")
    f_w.close()


if __name__ == "__main__":
    inputfile = "SoftInput.txt"
    # load training data
    print("--------load training data---------")
    feature, label, k = load_data(inputfile)
    # train the Softmax model
    print("--------training Softmax model--------")
    weights = gradient_ascent(feature, label, k, 100000, 0.4)
    # save the final model
    print("--------saving Softmax model--------")
    save_model("weights", weights)
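Note that gradient_ascent exponentiates the raw scores directly, which can overflow for large score values. A common hardening (not part of the original code; sketched here as a suggestion) subtracts each row's maximum score before exponentiating, which leaves the softmax probabilities unchanged:

```python
import numpy as np

def stable_softmax(scores):
    """Row-wise softmax using the max-subtraction trick to avoid overflow."""
    scores = np.asarray(scores, dtype=float)
    # softmax(x) == softmax(x - c) for any per-row constant c
    shifted = scores - scores.max(axis=1, keepdims=True)
    e = np.exp(shifted)
    return e / e.sum(axis=1, keepdims=True)

# naive np.exp() on these scores would overflow to inf
probs = stable_softmax([[1000.0, 1001.0, 1002.0]])
```

With the shift, the largest exponent is exactly 0, so np.exp never overflows and each row still sums to 1.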

2. Test Code

import numpy as np
import random as rd


def load_data(num, m):
    """Generate num random test samples with m columns (the first column is the bias)."""
    testData = np.mat(np.ones((num, m)))
    for i in range(num):
        testData[i, 1] = rd.random() * 6 - 3
        testData[i, 2] = rd.random() * 15
    return testData


def load_weights(weights_path):
    """Read the tab-separated weight matrix saved by the training script."""
    f = open(weights_path)
    w = []
    for line in f.readlines():
        w_tmp = []
        lines = line.strip().split("\t")
        for x in lines:
            w_tmp.append(float(x))
        w.append(w_tmp)
    f.close()
    weights = np.mat(w)
    m, n = np.shape(weights)
    return weights, m, n


def predict(test_data, weights):
    h = test_data * weights
    return h.argmax(axis=1)  # the class with the largest score is the prediction


def save_result(file_name, result):
    f_result = open(file_name, "w")
    m = np.shape(result)[0]
    for i in range(m):
        f_result.write(str(result[i, 0]) + "\n")
    f_result.close()


if __name__ == "__main__":
    print("--------load model--------")
    w, m, n = load_weights("weights")
    print("--------prediction--------")
    test_data = load_data(4000, m)
    print("--------save results--------")
    result = predict(test_data, w)
    save_result("result", result)  # the original omitted this call, so results were never written
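predict() takes the argmax of the raw scores test_data * weights rather than of the normalized probabilities. Since softmax is a strictly increasing transform within each row, both argmaxes always coincide, so the normalization step can be skipped at prediction time. A quick self-contained check (with made-up scores, not the trained model):

```python
import numpy as np

scores = np.array([[2.0, 0.5, 1.0],
                   [0.1, 0.2, 3.0]])
# row-wise softmax normalization
probs = np.exp(scores) / np.exp(scores).sum(axis=1, keepdims=True)

# argmax over raw scores and over probabilities picks the same class per row
print(scores.argmax(axis=1), probs.argmax(axis=1))
```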

3. Supplementary Notes

  • set(): creates an unordered collection with no duplicate elements; it supports membership tests, removes duplicates, and provides intersection, union, difference, and other set operations.

>>> x = set('runoob')
>>> y = set('google')
>>> x, y
({'b', 'r', 'u', 'o', 'n'}, {'e', 'o', 'g', 'l'})  # duplicates removed
>>> x & y  # intersection
{'o'}
>>> x | y  # union
{'b', 'e', 'g', 'l', 'o', 'n', 'r', 'u'}
>>> x - y  # difference
{'r', 'b', 'u', 'n'}
  • np.sum(axis): sums array elements along the given axis. With no axis argument, every element is summed into a single scalar; axis=0 sums down each column, and axis=1 sums across each row (the usage in the training code above).

>>> np.sum([[0,1,2],[2,1,3]], axis=1)
array([3, 6])
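To make the axis behavior concrete, a small sketch comparing the three cases on the same array:

```python
import numpy as np

a = np.array([[0, 1, 2],
              [2, 1, 3]])

total    = np.sum(a)          # no axis: sum every element -> 9
col_sums = np.sum(a, axis=0)  # down each column -> [2, 2, 5]
row_sums = np.sum(a, axis=1)  # across each row  -> [3, 6]
```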

