The ForwardPropagation.py code in Create_AI_Framework_In5Classes(Day2):
# -*- coding: utf-8 -*-
# Propagate values from the Input Layer through the Hidden Layers
# to produce the value of the Output Layer.
import math

class ForwardPropagation:
    def applyForwardPropagation(nodes, weights, instance):
        # Bias units always output 1 (set_value must be called, not assigned)
        for i in range(len(nodes)):
            if nodes[i].get_is_bias_unit() == True:
                nodes[i].set_value(1)
        # Feed the data into the Input Layer,
        # e.g. for an instance such as instance = [0, 1, 1]
        for j in range(len(instance) - 1):  # training only needs the features
            value_of_feature = instance[j]  # the concrete value of each feature in this record
            for k in range(len(nodes)):
                # the node at index 0 is the bias unit, so feature nodes start at index 1
                if j + 1 == nodes[k].get_index():
                    nodes[k].set_value(value_of_feature)
        # Process the Hidden Layers
        for j in range(len(nodes)):
            if nodes[j].get_is_bias_unit() == False and nodes[j].get_level() > 0:
                # sum of the products of all related neurons and weights in the previous layer
                target_neuron_input = 0
                # output after the non-linearity; here we use Sigmoid
                target_neuron_output = 0
                # ID of the current neuron
                target_index = nodes[j].get_index()
                for k in range(len(weights)):
                    # find the weights associated with the current neuron
                    if target_index == weights[k].get_to_index():
                        # value of this weight
                        weight_value = weights[k].get_value()
                        # ID of the neuron this weight comes from
                        from_index = weights[k].get_from_index()
                        # look up the value of that source neuron
                        for m in range(len(nodes)):
                            # find the neuron with that ID
                            if from_index == nodes[m].get_index():
                                # its concrete value
                                value_from_neuron = nodes[m].get_value()
                                # multiply the weight by that value and accumulate
                                target_neuron_input = target_neuron_input + (weight_value * value_from_neuron)
                                # each weight connects exactly one neuron of the previous layer,
                                # so there is no need to keep looping over the network's other neurons
                                break
                # note the indentation: this line is dedented four levels relative to the break
                # above, because Sigmoid is applied to the accumulated input of the current neuron
                target_neuron_output = 1 / (1 + math.exp(-target_neuron_input))
                # store the raw input and the Sigmoid-activated value in the current neuron
                nodes[j].set_input_value(target_neuron_input)
                nodes[j].set_value(target_neuron_output)
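The getters and setters above belong to the framework's Node and Weight classes. A minimal, hypothetical sketch of those two classes (the names mirror the accessors used above, but the fields are my assumption) is enough to run the same traversal on a tiny network: a bias unit plus two inputs feeding one Sigmoid output neuron.

```python
import math

class Node:
    # hypothetical minimal Node matching the accessors used in applyForwardPropagation
    def __init__(self, index, level, is_bias):
        self._index, self._level, self._is_bias = index, level, is_bias
        self._value, self._input_value = 0.0, 0.0
    def get_index(self): return self._index
    def get_level(self): return self._level
    def get_is_bias_unit(self): return self._is_bias
    def get_value(self): return self._value
    def set_value(self, v): self._value = v
    def set_input_value(self, v): self._input_value = v

class Weight:
    # hypothetical minimal Weight with from/to node IDs and a value
    def __init__(self, from_index, to_index, value):
        self._from, self._to, self._value = from_index, to_index, value
    def get_from_index(self): return self._from
    def get_to_index(self): return self._to
    def get_value(self): return self._value

# tiny network: bias (index 0) and two input nodes feed a single output node (index 3)
nodes = [Node(0, 0, True), Node(1, 0, False), Node(2, 0, False), Node(3, 1, False)]
weights = [Weight(0, 3, 0.5), Weight(1, 3, -1.0), Weight(2, 3, 1.0)]
instance = [1, 0, 1]  # two features plus a label

# the same traversal as applyForwardPropagation, inlined
for n in nodes:
    if n.get_is_bias_unit():
        n.set_value(1)
for j in range(len(instance) - 1):
    for n in nodes:
        if j + 1 == n.get_index():
            n.set_value(instance[j])
for n in nodes:
    if not n.get_is_bias_unit() and n.get_level() > 0:
        # weighted sum of the incoming neurons, then Sigmoid
        total = sum(w.get_value() * next(m.get_value() for m in nodes
                                         if m.get_index() == w.get_from_index())
                    for w in weights if w.get_to_index() == n.get_index())
        n.set_input_value(total)
        n.set_value(1 / (1 + math.exp(-total)))

print(nodes[3].get_value())  # sigmoid(0.5*1 + (-1.0)*1 + 1.0*0) = sigmoid(-0.5) ≈ 0.3775
```

Tracing such a toy case by hand is a good way to convince yourself that the nested loops really do visit each incoming weight exactly once per neuron.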
(3) Test results.
Test the results in the entry program Neuron_Network_Entry.py. The output is the value of the last node: to obtain the result of this Forward Propagation run, find the last node among all the nodes and print its predicted value.
Running Neuron_Network_Entry.py produces:
+1 V1 V2
Hidden layer creation: 1 N[1][1] N[1][2] N[1][3] N[1][4] N[1][5] N[1][6] N[1][7] N[1][8]
Hidden layer creation: 2 N[2][1] N[2][2] N[2][3] N[2][4]
Hidden layer creation: 3 N[3][1] N[3][2]
Output layer: Output
The weight from 1 at layers[0] to 4 at layers[1] : -0.01719299063527102
The weight from 1 at layers[0] to 5 at layers[1] : -0.8524173292229386
The weight from 1 at layers[0] to 6 at layers[1] : 0.22699060934105253
The weight from 1 at layers[0] to 7 at layers[1] : -0.18342643007293868
The weight from 1 at layers[0] to 8 at layers[1] : 0.535174965756674
The weight from 1 at layers[0] to 9 at layers[1] : -0.14676978791733708
The weight from 1 at layers[0] to 10 at layers[1] : 0.3575707340850214
The weight from 1 at layers[0] to 11 at layers[1] : -0.42137618665671717
The weight from 2 at layers[0] to 4 at layers[1] : -0.8009065486052088
The weight from 2 at layers[0] to 5 at layers[1] : 0.058917238487063095
The weight from 2 at layers[0] to 6 at layers[1] : -0.42346508944034544
The weight from 2 at layers[0] to 7 at layers[1] : 0.8426870154158392
The weight from 2 at layers[0] to 8 at layers[1] : 0.32010217521550643
The weight from 2 at layers[0] to 9 at layers[1] : -0.18659699268657703
The weight from 2 at layers[0] to 10 at layers[1] : -0.21967241566753914
The weight from 2 at layers[0] to 11 at layers[1] : -0.24400451550197744
The weight from 4 at layers[1] to 13 at layers[2] : 1.012406950277446
The weight from 4 at layers[1] to 14 at layers[2] : -0.7119667051463217
The weight from 4 at layers[1] to 15 at layers[2] : 0.6123794505814086
The weight from 4 at layers[1] to 16 at layers[2] : 0.20933909060981204
The weight from 5 at layers[1] to 13 at layers[2] : 0.8295825393038667
The weight from 5 at layers[1] to 14 at layers[2] : -0.18589793075961192
The weight from 5 at layers[1] to 15 at layers[2] : -0.4965519410696049
The weight from 5 at layers[1] to 16 at layers[2] : 0.8986794993436826
The weight from 6 at layers[1] to 13 at layers[2] : -0.5419190030559935
The weight from 6 at layers[1] to 14 at layers[2] : -0.030482481689729557
The weight from 6 at layers[1] to 15 at layers[2] : -0.16049573458078903
The weight from 6 at layers[1] to 16 at layers[2] : -0.2974003908293369
The weight from 7 at layers[1] to 13 at layers[2] : 0.2390039732386664
The weight from 7 at layers[1] to 14 at layers[2] : 0.5368392670597157
The weight from 7 at layers[1] to 15 at layers[2] : -0.38067640003252334
The weight from 7 at layers[1] to 16 at layers[2] : 0.08850351612527696
The weight from 8 at layers[1] to 13 at layers[2] : -0.45517979672262143
The weight from 8 at layers[1] to 14 at layers[2] : -0.48321818662131666
The weight from 8 at layers[1] to 15 at layers[2] : 0.5723650651069483
The weight from 8 at layers[1] to 16 at layers[2] : 0.20266673558402148
The weight from 9 at layers[1] to 13 at layers[2] : 0.4648137370852401
The weight from 9 at layers[1] to 14 at layers[2] : -0.9281727938087562
The weight from 9 at layers[1] to 15 at layers[2] : -0.3137211374881055
The weight from 9 at layers[1] to 16 at layers[2] : -0.9786522293599609
The weight from 10 at layers[1] to 13 at layers[2] : 0.7513533393983693
The weight from 10 at layers[1] to 14 at layers[2] : 0.6710274413316895
The weight from 10 at layers[1] to 15 at layers[2] : 0.9971668767938549
The weight from 10 at layers[1] to 16 at layers[2] : 0.34557762622192767
The weight from 11 at layers[1] to 13 at layers[2] : -0.8777163395171548
The weight from 11 at layers[1] to 14 at layers[2] : -0.05999121610430269
The weight from 11 at layers[1] to 15 at layers[2] : -0.6652778238381963
The weight from 11 at layers[1] to 16 at layers[2] : 0.08832878707869818
The weight from 13 at layers[2] to 18 at layers[3] : -0.8324236408834976
The weight from 13 at layers[2] to 19 at layers[3] : -0.7635884380604687
The weight from 14 at layers[2] to 18 at layers[3] : 0.8092754640273179
The weight from 14 at layers[2] to 19 at layers[3] : -0.09275930457566717
The weight from 15 at layers[2] to 18 at layers[3] : -0.6719236795958695
The weight from 15 at layers[2] to 19 at layers[3] : -0.8016890418335867
The weight from 16 at layers[2] to 18 at layers[3] : 1.0029644245873488
The weight from 16 at layers[2] to 19 at layers[3] : -0.8780918045481161
The weight from 18 at layers[3] to 20 at layers[4] : -0.8007053462211116
The weight from 19 at layers[3] to 20 at layers[4] : 0.9594412274529027
Prediction: 0.4477145403917822
Prediction: 0.44384068472048943
Prediction: 0.44955657205136573
Prediction: 0.44559590058743986
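A quick sanity check on the listing above: with 2 input nodes, hidden layers of 8, 4 and 2 neurons, and 1 output node, and with weights originating only from non-bias nodes (as the listing shows), the number of weight lines should be the sum of the products of adjacent layer sizes.

```python
layer_sizes = [2, 8, 4, 2, 1]  # inputs, three hidden layers, output (bias nodes excluded)
num_weights = sum(a * b for a, b in zip(layer_sizes, layer_sizes[1:]))
print(num_weights)  # 58, matching the 58 "The weight from ..." lines above
```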
There are 4 records in instances. From the input side, the 1st record is [0,0], the 2nd is [0,1], the 3rd is [1,0], and the 4th is [1,1]. In the output, the prediction for the 1st record is 0.4477145403917822, but the actual result is 0; the prediction for the 2nd record is 0.44384068472048943, but the actual result is 1; the prediction for the 3rd record is 0.44955657205136573, but the actual result is 1; and the prediction for the 4th record is 0.44559590058743986, but the actual result is 0. We have only run the network once, using the Sigmoid activation function, to see how the first two columns relate to the third; this first run is clearly inaccurate.

In the next chapter we will work backwards from the results, examine how the weights associated with each node contribute to the error, and adjust that contribution according to our algorithm; as the loop proceeds, the error will become smaller and smaller. In TensorFlow's visualization the initial error is around 50%, and our initial error is also large: the true result is either 0 or 1, while our predictions are values such as 0.4477145403917822. As the program runs, the error shrinks. To achieve this, we must trace every preceding step back from the result and decide whether each weight value needs adjusting; once the weights change, the next run produces different neuron values, and repeated adjustment brings the predictions ever closer to the target.
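As a rough measure of how far off this first run is, one can compute the mean squared error between the four printed predictions and the true XOR labels. This is a quick hypothetical check, not something the framework itself prints.

```python
predictions = [0.4477145403917822, 0.44384068472048943,
               0.44955657205136573, 0.44559590058743986]
labels = [0, 1, 1, 0]  # the true results for [0,0], [0,1], [1,0], [1,1]

# mean squared error of the first forward pass
mse = sum((p - y) ** 2 for p, y in zip(predictions, labels)) / len(labels)
print(mse)  # about 0.25, i.e. every prediction is roughly 0.5 away from its label
```

An untrained network that always outputs about 0.45 is essentially guessing, which is consistent with the roughly 50% starting error seen in TensorFlow's visualization.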