cntk-notes
CNTK Embedding layer
“Embedding” refers to representing words or other discrete items by dense continuous vectors. This layer assumes that the input is in one-hot form. E.g., for a vocabulary size of 10,000, each input vector is expected to have dimension 10,000 and consist of zeroes except for one position that contains a 1. The index of that location is the index of the word or item it represents.
As the paragraph above explains, the Embedding layer's job is to convert one-hot vectors into dense continuous vectors. The conversion is implemented as a matrix multiplication, but when the vocabulary size is large, it is more efficient to pass the one-hot input in sparse form before multiplying. This is done with the parameter is_sparse=True:
import cntk as C

# Declare a sparse input variable (784-dimensional, as in the original snippet).
input = C.input_variable(shape=(784,), is_sparse=True)
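As a fuller illustration, here is a minimal sketch (not from the original post) of the pattern described above, using CNTK's Python Layers API (cntk.layers.Embedding). The vocabulary size of 10000, embedding dimension of 300, and word id 42 are illustrative values chosen for this example.

import cntk as C
import numpy as np
import scipy.sparse as sp

vocab_size = 10000   # dimension of the one-hot input vectors
emb_dim = 300        # dimension of the dense embedding vectors

# Sparse one-hot input: each sample carries a single 1 at the word's index.
x = C.input_variable(shape=(vocab_size,), is_sparse=True)

# Embedding is a learnable (vocab_size x emb_dim) matrix; multiplying it by a
# one-hot vector simply selects the row corresponding to that word.
embed = C.layers.Embedding(emb_dim)
z = embed(x)
print(z.shape)   # (300,) - each word maps to a dense 300-dim vector

# Feed one sparse one-hot row (word id 42) as a scipy CSR matrix and check
# that the output is a single dense embedding vector.
one_hot = sp.csr_matrix(([1.0], ([0], [42])), shape=(1, vocab_size), dtype=np.float32)
print(z.eval({x: one_hot})[0].shape)   # (300,)

Because the input is declared with is_sparse=True, the one-hot data can be fed as a sparse CSR matrix rather than a full dense vector, which avoids materializing a 10000-dimensional array per word.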
Reposted from: https://www.cnblogs.com/huizhu135/p/6044860.html