Uni-Layer Neural Network | Linear Algebra using Python
A neural network is a powerful tool often used in Machine Learning, because neural networks are fundamentally mathematical objects. We will use our basics of Linear Algebra and NumPy to understand the foundations of Machine Learning with Neural Networks.
This article showcases an application of Linear Algebra; Python provides a wide set of libraries that motivate its use for machine learning.
The figure shows a neural network with multiple inputs and one output node.
The inputs to the neural network are X1, X2, X3, ..., Xn, and their corresponding weights are w1, w2, w3, ..., wn respectively. The output z is a hyperbolic tangent function used for decision making, whose input is the sum of the products of the inputs and their weights.
Mathematically, z = tanh(∑ Xi wi)
where tanh() is the hyperbolic tangent function (refer to the article Linear Algebra | Tangent Hyperbolic Function), one of the most commonly used decision-making functions.
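As a quick illustrative sketch (the vectors here are made up for illustration and are not from the article), the formula z = tanh(∑ Xi wi) can be evaluated directly with NumPy, since the sum of products is just a dot product:

```python
import numpy as np

# Hypothetical input and weight vectors, for illustration only
X = np.array([0.5, 0.1, 0.4])
w = np.array([0.2, 0.8, 0.6])

# z = tanh(sum of element-wise products) = tanh(X . w)
z = np.tanh(np.dot(X, w))
print(z)  # tanh(0.42) ≈ 0.39693
```

Note that np.tanh is applied to the single scalar produced by the dot product, so the result is one prediction value.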
We implement this mathematical network in Python by defining a function neural_network(X, W). Note: the hyperbolic tangent function accepts any real-valued input and returns a value in the range −1 to 1.
Input parameter(s): Vectors X and W
Return: A value between −1 and 1, as the neural network's prediction based on the inputs.
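A brief sketch (with made-up extreme inputs, not from the article) showing why the prediction always stays within −1 and 1: tanh saturates toward ±1 for large positive or negative weighted sums:

```python
import numpy as np

def neural_network(inputs, weights):
    # Weighted sum of inputs, followed by the tanh activation
    return np.tanh(np.dot(np.transpose(weights), inputs))

# Hypothetical extreme inputs to demonstrate saturation
big = np.array([100.0, 100.0])
w = np.array([1.0, 1.0])

print(neural_network(big, w))   # very close to 1.0
print(neural_network(-big, w))  # very close to -1.0
```

Whatever the magnitude of the weighted sum, the output never leaves the open interval (−1, 1).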
Applications:
Machine Learning
Computer Vision
Data Analysis
Fintech
Python program for Uni-Layer Neural Network
# Linear Algebra and Neural Network
# Linear Algebra Learning Sequence

import numpy as np

# Use of np.array() to define an Input Vector
inp = np.array([0.323, 0.432, 0.546, 0.44, 0.123, 0.78, 0.123])
print("The Vector A : ", inp)

# Defining the Weight Vector
weigh = np.array([0.3, 0.63, 0.99, 0.89, 0.50, 0.66, 0.123])
print("\nThe Vector B : ", weigh)

# Defining a neural network for predicting an output value
def neural_network(inputs, weights):
    wT = np.transpose(weights)
    elpro = wT.dot(inputs)
    # Hyperbolic tangent function for decision making
    out = np.tanh(elpro)
    return out

outputi = neural_network(inp, weigh)

# Printing the expected output
print("\nExpected Output of the given Input data and their respective Weight : ", outputi)

Output:
The Vector A :  [0.323 0.432 0.546 0.44  0.123 0.78  0.123]

The Vector B :  [0.3   0.63  0.99  0.89  0.5   0.66  0.123]

Expected Output of the given Input data and their respective Weight :  0.9556019596251646

Translated from: https://www.includehelp.com/python/uni-layer-neural-network.aspx
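As a sanity check (my own verification sketch, not part of the original article), the printed prediction can be reproduced by computing the dot product of the two vectors and applying tanh directly:

```python
import numpy as np

inp = np.array([0.323, 0.432, 0.546, 0.44, 0.123, 0.78, 0.123])
weigh = np.array([0.3, 0.63, 0.99, 0.89, 0.50, 0.66, 0.123])

# Weighted sum of inputs (the argument passed to tanh)
s = np.dot(inp, weigh)  # ≈ 1.892629
print(np.tanh(s))       # ≈ 0.9556019596251646
```

The transpose in the program above is not strictly needed for 1-D NumPy arrays; np.dot on two 1-D arrays already computes the inner product.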