beautiful loss function
The initial model increased returns by 9%:
```python
from keras import backend as K  # n_classes is assumed to be defined globally

def mycrossentropy(y_true, y_pred, e=0.001):  # e is unused; kept from the original signature
    # Split off the first n_classes columns (classification head)
    # from the remaining columns (auxiliary regression head).
    b = y_pred[:, :n_classes]
    b1 = y_pred[:, n_classes:]
    c = y_true[:, :n_classes]
    c1 = y_true[:, n_classes:]
    # Keras backend argument order is (target, output).
    loss1 = K.categorical_crossentropy(c, b)
    loss2 = K.abs(K.sum(b1) - K.sum(c1))
    # Scale the cross-entropy by a cubic polynomial in the regression error.
    return loss1 * (loss2**3 / 6 + loss2**2 / 2 + loss2 + 1)
```

The second model increased returns by *%:
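To make the arithmetic of the first loss easy to check outside a training loop, here is a minimal NumPy sketch of the same computation. The function name `mycrossentropy_np`, the value of `n_classes`, and the sample inputs are illustrative assumptions, not part of the original model.

```python
import numpy as np

n_classes = 3  # assumption for this sketch

def mycrossentropy_np(y_true, y_pred, eps=1e-7):
    # Same split as the Keras version: softmax columns, then auxiliary column(s).
    b, b1 = y_pred[:, :n_classes], y_pred[:, n_classes:]
    c, c1 = y_true[:, :n_classes], y_true[:, n_classes:]
    # Per-sample categorical cross-entropy.
    loss1 = -np.sum(c * np.log(np.clip(b, eps, 1.0)), axis=-1)
    # Scalar absolute error between summed auxiliary outputs.
    loss2 = abs(b1.sum() - c1.sum())
    return loss1 * (loss2**3 / 6 + loss2**2 / 2 + loss2 + 1)

y_true = np.array([[1.0, 0.0, 0.0, 0.2]])
y_pred = np.array([[0.7, 0.2, 0.1, 0.5]])
loss = mycrossentropy_np(y_true, y_pred)
```

With these inputs the regression error is 0.3, so the cross-entropy −ln(0.7) gets multiplied by 1 + 0.3 + 0.3²/2 + 0.3³/6.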
```python
def mycrossentropy(y_true, y_pred, e=0.001):
    b = y_pred[:, :n_classes]
    b1 = y_pred[:, n_classes:]
    c = y_true[:, :n_classes]
    c1 = y_true[:, n_classes:]
    loss1 = K.categorical_crossentropy(c, b)
    # Even powers of the element-wise regression error.
    l2 = K.sum(K.pow(b1 - c1, 2), axis=-1)
    l3 = K.sum(K.pow(b1 - c1, 4), axis=-1)
    l4 = K.sum(K.pow(b1 - c1, 6), axis=-1)
    # Alternatives tried: K.abs(K.sum(b1) - K.sum(c1)),
    # K.square(K.sum(b1) - K.sum(c1)), K.sum(K.square(b1 - c1), axis=-1).
    return loss1 * (l2 / 2 + l3 / 6 + l4 / 24)
```

A further variant weights the cross-entropy with powers of the absolute regression error, whose coefficients match the first terms of the Taylor series of e^x:

```python
def mycrossentropy(y_true, y_pred, e=0.001):
    b = y_pred[:, :n_classes]
    b1 = y_pred[:, n_classes:]
    c = y_true[:, :n_classes]
    c1 = y_true[:, n_classes:]
    loss1 = K.categorical_crossentropy(c, b)
    # Powers 1..5 of the element-wise absolute regression error.
    ab1 = K.sum(K.pow(K.abs(b1 - c1), 1), axis=-1)
    ab2 = K.sum(K.pow(K.abs(b1 - c1), 2), axis=-1)
    ab3 = K.sum(K.pow(K.abs(b1 - c1), 3), axis=-1)
    ab4 = K.sum(K.pow(K.abs(b1 - c1), 4), axis=-1)
    ab5 = K.sum(K.pow(K.abs(b1 - c1), 5), axis=-1)
    return loss1 * (1 + ab1 + ab2 / 2 + ab3 / 6 + ab4 / 24 + ab5 / 120)
```
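When the auxiliary head is a single value (an assumption for this sketch; with several columns the sums no longer collapse this way), the factor 1 + d + d²/2 + d³/6 + d⁴/24 + d⁵/120 is the fifth-order Taylor polynomial of e^d, so the last loss effectively scales the cross-entropy by roughly e to the power of the absolute regression error:

```python
import numpy as np

def taylor_factor(d):
    # Fifth-order Taylor polynomial of exp(d) around 0 -- the weighting
    # factor used in the last loss variant for a scalar error d.
    return 1 + d + d**2 / 2 + d**3 / 6 + d**4 / 24 + d**5 / 120

d = 0.5
approx, exact = taylor_factor(d), np.exp(d)
print(approx, exact)  # close for small d
```

This makes the intent of the hand-written coefficients visible: large regression errors inflate the classification loss exponentially rather than linearly.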