Implementing Logic Gates in Neural Networks and a Solution for XOR
Neural Networks have risen to prominence in recent years as one of the most powerful machine learning techniques (and over-used buzzwords) in tech. In this post I'll give a beginner-friendly overview of how Neural Nets work and how they can be used to solve a simple but fundamental problem: representing logic gates. This blog post is based on the book "Neural Networks and Learning Machines" by Simon Haykin, for anyone who would like to explore the topic in more detail.
Multi-Layer Perceptrons (MLPs) are perhaps the most commonly used form of Neural Net. The standard MLP architecture is feedforward, in that activation flows one way, from input to output. An MLP can be broken down into three sections: an input layer, hidden layers and an output layer. The input layer takes data into the network, the hidden layers perform processing and the output layer sends out the result.
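To make the flow of activation concrete, here is a minimal sketch of a feedforward pass through a one-hidden-layer MLP. This is my own illustrative example in Python/NumPy, not code from the original post; the step activation and the particular weights are assumptions chosen purely for illustration.

```python
import numpy as np

def step(z):
    """Step activation: a unit fires (1) when its summed input exceeds 0."""
    return (z > 0).astype(int)

def forward(x, W_hidden, b_hidden, W_out, b_out):
    """One feedforward pass: input layer -> hidden layer -> output layer."""
    hidden = step(W_hidden @ x + b_hidden)  # hidden layer does the processing
    return step(W_out @ hidden + b_out)     # output layer sends out the result

# Example with 2 inputs, 2 hidden units and 1 output (weights chosen for illustration)
x = np.array([1, 0])
W_hidden = np.array([[1.0, 1.0], [-1.0, -1.0]])
b_hidden = np.array([-0.5, 1.5])
W_out = np.array([[1.0, 1.0]])
b_out = np.array([-1.5])
print(forward(x, W_hidden, b_hidden, W_out, b_out))  # -> [1]
```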
Perceptrons (like all Neural Nets) are composed of units, which represent the neurons in the human brain on which neural nets are modelled. A unit takes one or more inputs, sums them together, and passes data on if the summed value is greater than the threshold specified by the activation function. The activation threshold in the perceptrons shown below is 0. A bias unit is another kind of unit which always activates, typically sending a 1 to every unit it is connected to. The diagram below shows Perceptron implementations for both AND and OR logic gates.
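As a concrete illustration of such units, here is a minimal sketch of AND and OR perceptrons: each sums its weighted inputs plus a bias and fires when the total exceeds the activation threshold of 0. The specific weights and biases are one workable choice of my own, not values taken from the diagram below.

```python
def perceptron(inputs, weights, bias):
    """Fire (return 1) if the weighted sum of inputs plus bias exceeds the 0 threshold."""
    total = sum(w * x for w, x in zip(weights, inputs)) + bias
    return 1 if total > 0 else 0

def AND(x1, x2):
    # Both inputs must be 1 for the sum to exceed 0: 1 + 1 - 1.5 = 0.5
    return perceptron([x1, x2], weights=[1, 1], bias=-1.5)

def OR(x1, x2):
    # Either input being 1 pushes the sum past 0: 1 - 0.5 = 0.5
    return perceptron([x1, x2], weights=[1, 1], bias=-0.5)
```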
[Figure: Perceptron implementations of the AND and OR logic gates]

A major issue in the early days of neural net development was representing the XOR function, the logic for which is shown in the table below. A simple Perceptron with only input and output layers (as used for AND and OR above) can only separate data with a straight line, while XOR is not linearly separable.

x1  x2  x1 XOR x2
0   0   0
0   1   1
1   0   1
1   1   0
However, a workaround was found when it was shown that x1 XOR x2 = (x1 OR x2) AND (NOT(x1 AND x2)). With this knowledge, the Perceptron shown below can be developed to represent XOR.
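A minimal sketch of that construction, reusing the perceptron, AND and OR helpers from the earlier snippet: one hidden unit computes x1 OR x2, another computes NOT(x1 AND x2), and an output unit ANDs the two. The weights for the NOT(AND) unit are again just one workable choice, not taken from the diagram.

```python
def NAND(x1, x2):
    # NOT(x1 AND x2): only stays off when both inputs are 1 (-1 - 1 + 1.5 = -0.5)
    return perceptron([x1, x2], weights=[-1, -1], bias=1.5)

def XOR(x1, x2):
    # Hidden layer: one unit computes OR, the other NOT(AND); the output unit ANDs them.
    h1 = OR(x1, x2)
    h2 = NAND(x1, x2)
    return AND(h1, h2)
```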
The table below shows that the XOR perceptron returns the correct output for each of the four possible input combinations.

x1  x2  x1 OR x2  NOT(x1 AND x2)  x1 XOR x2
0   0   0         1               0
0   1   1         1               1
1   0   1         1               1
1   1   1         0               0
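Running the sketch above (assuming the XOR helper defined earlier) over all four input combinations reproduces the table:

```python
for x1 in (0, 1):
    for x2 in (0, 1):
        print(x1, x2, XOR(x1, x2))
# Prints:
# 0 0 0
# 0 1 1
# 1 0 1
# 1 1 0
```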
In conclusion, this blog has covered a few simple uses for Perceptrons. Perceptrons, and especially MLPs, are a very powerful technique with a wide variety of applications.
Translated from: https://medium.com/analytics-vidhya/implementing-logic-gates-in-neural-nets-and-a-solution-for-xor-ebf68cf8109b