[Hands-On] Softmax Regression


The code and data have been uploaded and can be downloaded here: https://download.csdn.net/download/shenziheng1/10721992
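Judging from load_data in the training code below, each line of SoftInput.txt holds tab-separated feature values with an integer class label in the last column (the offset term is appended by the code, not stored in the file). A purely hypothetical two-feature sample line might look like:

2.357	8.914	1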

1. Training Code

import numpy as np


def load_data(inputfile):
    """Read tab-separated samples; the last column is the integer class label."""
    f = open(inputfile)
    feature_data = []
    label_data = []
    for line in f.readlines():
        feature_tmp = []
        feature_tmp.append(1)  # offset (bias) term
        lines = line.strip().split("\t")
        for i in range(len(lines) - 1):
            feature_tmp.append(float(lines[i]))
        label_data.append(int(lines[-1]))
        feature_data.append(feature_tmp)
    f.close()
    return np.mat(feature_data), np.mat(label_data).T, len(set(label_data))


def gradient_ascent(feature_data, label_data, k, maxCycle, alpha):
    """Fit a k-class Softmax model by gradient ascent on the log-likelihood."""
    m, n = np.shape(feature_data)
    weights = np.mat(np.ones((n, k)))
    i = 0
    while i <= maxCycle:
        err = np.exp(feature_data * weights)  # unnormalized scores exp(x_i . w_j)
        if i % 100 == 0:
            print("\t-------iter: ", i, ", cost: ", cost(err, label_data))
        rowsum = -err.sum(axis=1)
        rowsum = rowsum.repeat(k, axis=1)
        err = err / rowsum  # err now holds -P(y=j|x) for every sample and class
        for x in range(m):
            err[x, label_data[x, 0]] += 1  # add the indicator: 1{y=j} - P(y=j|x)
        weights = weights + (alpha / m) * feature_data.T * err
        i += 1
    return weights


def cost(err, label_data):
    """Average cross-entropy: -log of the probability assigned to the true class."""
    m = np.shape(err)[0]
    sum_cost = 0.0
    for i in range(m):
        if err[i, label_data[i, 0]] / np.sum(err[i, :]) > 0:
            sum_cost -= np.log(err[i, label_data[i, 0]] / np.sum(err[i, :]))
        else:
            sum_cost -= 0.0
    return sum_cost / m


def save_model(file_name, weights):
    """Write the weight matrix to disk, one tab-separated row per line."""
    f_w = open(file_name, "w")
    m, n = np.shape(weights)
    for i in range(m):
        w_tmp = []
        for j in range(n):
            w_tmp.append(str(weights[i, j]))
        f_w.write("\t".join(w_tmp) + "\n")
    f_w.close()


if __name__ == "__main__":
    inputfile = "SoftInput.txt"
    # load training data
    print("--------load training data---------")
    feature, label, k = load_data(inputfile)
    # train the Softmax model
    print("--------training Softmax model--------")
    weights = gradient_ascent(feature, label, k, 100000, 0.4)
    # save the final model
    print("--------saving Softmax model--------")
    save_model("weights", weights)
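For readers who want the math behind gradient_ascent: the model assigns class probabilities with the softmax function, and the loop performs gradient ascent on the average log-likelihood. In standard notation,

\[
P(y_i = j \mid x_i; W) = \frac{\exp(w_j^{\top} x_i)}{\sum_{l=1}^{k} \exp(w_l^{\top} x_i)},
\qquad
\nabla_{w_j} \ell(W) = \frac{1}{m} \sum_{i=1}^{m} x_i \Big( \mathbf{1}\{y_i = j\} - P(y_i = j \mid x_i; W) \Big).
\]

The sign trick in the code (negating the row sums before dividing, then adding 1 at the true label) builds exactly this indicator-minus-probability matrix, so weights = weights + (alpha / m) * feature_data.T * err is one ascent step, and cost() reports the matching average cross-entropy \( -\frac{1}{m} \sum_{i} \log P(y_i \mid x_i; W) \).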

2. Test Code

import numpy as np
import random as rd


def load_data(num, m):
    """Generate num random test samples; column 0 stays 1 as the offset term."""
    testData = np.mat(np.ones((num, m)))
    for i in range(num):
        testData[i, 1] = rd.random() * 6 - 3  # first feature in [-3, 3)
        testData[i, 2] = rd.random() * 15     # second feature in [0, 15)
    return testData


def load_weights(weights_path):
    """Load the tab-separated weight matrix written by save_model."""
    f = open(weights_path)
    w = []
    for line in f.readlines():
        w_tmp = []
        lines = line.strip().split("\t")
        for x in lines:
            w_tmp.append(float(x))
        w.append(w_tmp)
    f.close()
    weights = np.mat(w)
    m, n = np.shape(weights)
    return weights, m, n


def predict(test_data, weights):
    h = test_data * weights
    return h.argmax(axis=1)  # pick the class with the maximum score


def save_result(file_name, result):
    f_result = open(file_name, "w")
    m = np.shape(result)[0]
    for i in range(m):
        f_result.write(str(result[i, 0]) + "\n")
    f_result.close()


if __name__ == "__main__":
    print("--------load model--------")
    w, m, n = load_weights("weights")
    print("--------prediction--------")
    test_data = load_data(4000, m)
    result = predict(test_data, w)
    print("--------save results--------")
    save_result("result", result)  # write predicted labels to an output file
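One detail worth noting: predict returns the argmax of the raw scores test_data * weights. Because exp is monotonic, this picks the same class as the argmax of the softmax probabilities, so normalization can be skipped for pure classification. If calibrated probabilities are wanted (for thresholding or ranking), a minimal sketch follows; softmax_prob is a hypothetical helper, not part of the original code:

import numpy as np

def softmax_prob(test_data, weights):
    # raw class scores, shape (num_samples, k)
    scores = np.asarray(test_data * weights)
    # subtract the row-wise max before exponentiating, for numerical stability
    scores -= scores.max(axis=1, keepdims=True)
    e = np.exp(scores)
    # normalize each row so the k class probabilities sum to 1
    return e / e.sum(axis=1, keepdims=True)

# probs = softmax_prob(test_data, w); probs.argmax(axis=1) agrees with predict(test_data, w)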

3. Supplementary Notes

  • set(): creates an unordered collection with no duplicate elements. It supports membership tests, removes duplicates automatically, and can compute intersections, unions, differences, and more.
>>> x = set('runoob')
>>> y = set('google')
>>> x, y                 # duplicates removed; element order is arbitrary
({'b', 'r', 'u', 'o', 'n'}, {'e', 'o', 'g', 'l'})
>>> x & y                # intersection
{'o'}
>>> x | y                # union
{'b', 'e', 'g', 'l', 'o', 'n', 'r', 'u'}
>>> x - y                # difference
{'r', 'b', 'u', 'n'}
  • sum(axis): selects the axis along which np.sum adds up array elements: axis=1 sums across each row, axis=0 sums down each column, and the default (axis=None) sums every element into a single scalar (a check of all three cases follows the example below).
>>> np.sum([[0, 1, 2], [2, 1, 3]], axis=1)
array([3, 6])
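To complete the picture, the default and the column-wise variants on the same input:

>>> import numpy as np
>>> a = np.array([[0, 1, 2], [2, 1, 3]])
>>> np.sum(a)            # default axis=None: one grand total
9
>>> np.sum(a, axis=0)    # sums down each column
array([2, 2, 5])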

