
Revise Chapter5

Sm1les, 7 years ago
commit 86ecafdebb
3 files changed, 76 additions and 47 deletions
  1. +12 −0 Chapter5/READEME.md
  2. +63 −47 Chapter5/chapter5.md
  3. +1 −0 Chapter7/README.md

+ 12 - 0
Chapter5/READEME.md

@@ -0,0 +1,12 @@
+# Chapter 5: Neural Networks
+
+### 5.1 Neuron Model
+### 5.2 Perceptron and Multi-Layer Networks
+- [5.2](https://github.com/Datawhale18/pumpkin-book/blob/master/Chapter5/chapter5.md)
+### 5.3 Error Backpropagation Algorithm
+- [5.12](https://github.com/Datawhale18/pumpkin-book/blob/master/Chapter5/chapter5.md)
+- [5.13](https://github.com/Datawhale18/pumpkin-book/blob/master/Chapter5/chapter5.md)
+- [5.14](https://github.com/Datawhale18/pumpkin-book/blob/master/Chapter5/chapter5.md)
+### 5.4 Global Minimum and Local Minimum
+### 5.5 Other Common Neural Networks
+### 5.6 Deep Learning

+ 63 - 47
Chapter5/chapter5.md

@@ -1,48 +1,64 @@
-# 5.1 Neuron Model
-# 5.2 Perceptron and Multi-Layer Networks
-# 5.3 Error Backpropagation Algorithm
-For a training example $(x_k,y_k)$, suppose the network output is $\hat{y}_k = (\hat{y}_1^k , \hat{y}_2^k,\ldots,\hat{y}_l^k)$
-$$ \hat{y}_j^k = f(\beta_j - \theta_j) $$ (5.3)
-Let the mean squared error of the network on $(x_k,y_k)$ be
- $$ E_k = \frac{1}{2} \sum^l_{j=1}(\hat{y}_j^k -y_j^k)^2$$  (5.4)
-
-> The factor 1/2 is only a convention, chosen so the coefficient cancels when differentiating
-
-Let the learning rate be $\eta$. Adjusting the parameters along the negative gradient of the objective means:
-adjust the parameter using the negative gradient of the objective function (the mean squared error $E_k$) at $w_{hj}$:
-$$ \Delta w_{hj} = - \eta\frac{\partial E_k}{\partial w_{hj}}$$ (5.6)
-
-By the chain rule:
-$$ \frac{\partial E_k}{\partial w_{hj}} = \frac{\partial E_k}{\partial \hat{y}_j^k}\frac{\partial \hat{y}_j^k}{\partial \beta_j}\frac{\partial \beta_j}{\partial w_{hj}}$$ (5.7)
-
-From the definition of $\beta_j$,
-$$ \beta_j = \sum^q_{h=1} w_{hj}b_h$$
-
-we have
-$$\frac{\partial \beta_j}{\partial w_{hj}} = b_h$$ (5.8)
-The sigmoid function of Fig. 5.2 is
-$$ \mathrm{sigmoid}(x) = \frac{1}{1+e^{-x}}$$
+### 5.2
+$$\Delta w_i = \eta(y-\hat{y})x_i$$
+[Derivation]: Here the perceptron model is:
+$$y=f(\sum_{i} w_i x_i - \theta)$$
+Treating $\theta$ as a dummy node, the model simplifies to:
+$$y=f(\sum_{i} w_i x_i)=f(\boldsymbol w^T \boldsymbol x)$$
+where $f$ is the step function. <br>Following §2 of *Statistical Learning Methods*, let $M$ be the set of misclassified points and $\boldsymbol x_i \in M$ a misclassified point with true label $y_i$ and model prediction $\hat{y}_i$. For a misclassified $\boldsymbol x_i$, either $\boldsymbol w^T \boldsymbol x_i \gt 0,\hat{y}_i=1,y_i=0$ or $\boldsymbol w^T \boldsymbol x_i \lt 0,\hat{y}_i=0,y_i=1$; combining the two cases gives:
+$$(\hat{y}_i-y_i)\boldsymbol w^T \boldsymbol x_i>0$$
+so the loss function can be taken as:
+$$L(\boldsymbol w)=\sum_{\boldsymbol x_i \in M} (\hat{y}_i-y_i)\boldsymbol w^T \boldsymbol x_i$$
+The gradient of the loss is:
+$$\nabla_w L(\boldsymbol w)=\sum_{\boldsymbol x_i \in M} (\hat{y}_i-y_i)\boldsymbol x_i$$
+Randomly pick one misclassified point $(\boldsymbol x_i,y_i)$ and update $\boldsymbol w$:
+$$\boldsymbol w \leftarrow \boldsymbol w-\eta(\hat{y}_i-y_i)\boldsymbol x_i=\boldsymbol w+\eta(y_i-\hat{y}_i)\boldsymbol x_i$$
+Equation (5.2) is exactly the resulting change in the $i$-th component $w_i$ of $\boldsymbol w$.
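This update rule can be illustrated numerically with one learning step (a hypothetical Python sketch; the sample, zero initial weights, and $\eta = 1$ are all made-up assumptions):

```python
import numpy as np

eta = 1.0                        # learning rate (assumed)
w = np.zeros(3)                  # weights; theta folded in as w[2] with dummy input -1

x = np.array([1.0, 1.0, -1.0])   # one sample, with dummy input -1 appended for theta
y = 1.0                          # its true label

y_hat = float(w @ x > 0)         # step activation: 1 if w^T x > 0, else 0
w += eta * (y - y_hat) * x       # Delta w_i = eta * (y - y_hat) * x_i
# w is now [1., 1., -1.]: the misclassified point pulled w toward itself
```

Since $y - \hat{y} = 1$ here, the update adds $\eta\,x_i$ to each component, matching (5.2).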
+### 5.12
+$$\Delta \theta_j = -\eta g_j$$
+[Derivation]: Since
+$$\Delta \theta_j = -\eta \cfrac{\partial E_k}{\partial \theta_j}$$
+and
+$$
+\begin{aligned}
+\cfrac{\partial E_k}{\partial \theta_j} &= \cfrac{\partial E_k}{\partial \hat{y}_j^k} \cdot\cfrac{\partial \hat{y}_j^k}{\partial \theta_j} \\\\
+&= (\hat{y}_j^k-y_j^k) \cdot f'(\beta_j-\theta_j) \cdot (-1) \\\\
+&= -(\hat{y}_j^k-y_j^k)f'(\beta_j-\theta_j) \\\\
+&= g_j
+\end{aligned}
+$$
 it follows that
-$$ f' = \frac{e^{-x}}{(1+e^{-x})^2} = f(x)(1-f(x))$$ (5.9)
-From (5.3) and (5.4),
-$$  - \frac{\partial E_k}{\partial \hat{y}_j^k}\frac{\partial \hat{y}_j^k}{\partial \beta_j} = - (\hat{y}_j^k - y_j^k)f'(\beta_j - \theta_j) = - (\hat{y}_j^k - y_j^k)[\hat{y}_j^k(1-\hat{y}_j^k)] = \hat{y}_j^k(1-\hat{y}_j^k)(y_j^k - \hat{y}_j^k) = g_j$$  
-(5.10)
->$g_j$ is an introduced shorthand variable
-
-Substituting (5.10) and (5.8) into (5.7), and then into (5.6), gives the BP update rule for $w_{hj}$:
-$$ \Delta w_{hj} = \eta g_j b_h$$ (5.11)
-
-Similarly:
-
-$$\Delta\theta_j = -\eta g_j $$ (5.12)
-
-> Note that here $\theta_j$ is treated as a weight $w$ whose input is fixed at $-1$, so it plays the same role as $\Delta w_{hj}$; setting $b_h = -1$ in (5.11) gives this result
-$$\Delta v_{ih} = -\eta \frac{\partial E_k}{\partial v_{ih}}  $$
-$$\frac{\partial E_k}{\partial v_{ih}} = \frac{\partial E_k}{\partial{b_h}}\frac{\partial b_h}{\partial \alpha_h}\frac{\partial \alpha_h}{\partial v_{ih}} = -e_hx_i$$
-
-so 
-$$\Delta v_{ih} = \eta e_h x_i  $$ (5.13)
-where
- $$e_h = -\frac{\partial E_k}{\partial{b_h}}\frac{\partial b_h}{\partial \alpha_h} = -\sum_{j=1}^l \frac{\partial E_k}{\partial \beta_j}\frac{\partial \beta_j}{\partial b_h}f'(\alpha_h - \gamma_h) = \sum_{j=1}^lw_{hj}g_jf'(\alpha_h - \gamma_h) = b_h(1-b_h)\sum_{j=1}^lw_{hj}g_j$$ 
-As with (5.12), we obtain
-$$\Delta \gamma_h = -\eta e_h$$ (5.14)
+$$\Delta \theta_j = -\eta \cfrac{\partial E_k}{\partial \theta_j}=-\eta g_j$$
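Since the derivation shows $\partial E_k/\partial \theta_j = g_j$, the identity can be checked with a finite difference on a single output unit (a sketch; the values of $\beta_j$, $\theta_j$, and $y_j^k$ below are arbitrary made-up numbers):

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

beta, theta, y = 0.8, 0.3, 1.0            # arbitrary values for beta_j, theta_j, y_j^k

def E(th):                                # squared error on one output unit, eq. (5.4)
    y_hat = sigmoid(beta - th)
    return 0.5 * (y_hat - y) ** 2

y_hat = sigmoid(beta - theta)
g = -(y_hat - y) * y_hat * (1 - y_hat)    # g_j = -(y_hat - y) * f'(beta - theta)

# central difference of E_k w.r.t. theta_j; should agree with g_j
num = (E(theta + 1e-6) - E(theta - 1e-6)) / 2e-6
```

Here `num` and `g` agree to about eight decimal places, which is exactly the statement $\Delta \theta_j = -\eta\, \partial E_k/\partial \theta_j = -\eta g_j$.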
+### 5.13
+$$\Delta v_{ih} = \eta e_h x_i$$
+[Derivation]: Since
+$$\Delta v_{ih} = -\eta \cfrac{\partial E_k}{\partial v_{ih}}$$
+and
+$$
+\begin{aligned}
+\cfrac{\partial E_k}{\partial v_{ih}} &= \sum_{j=1}^{l} \cfrac{\partial E_k}{\partial \hat{y}_j^k} \cdot \cfrac{\partial \hat{y}_j^k}{\partial \beta_j} \cdot \cfrac{\partial \beta_j}{\partial b_h} \cdot \cfrac{\partial b_h}{\partial \alpha_h} \cdot \cfrac{\partial \alpha_h}{\partial v_{ih}} \\\\
+&= \sum_{j=1}^{l} \cfrac{\partial E_k}{\partial \hat{y}_j^k} \cdot \cfrac{\partial \hat{y}_j^k}{\partial \beta_j} \cdot \cfrac{\partial \beta_j}{\partial b_h} \cdot \cfrac{\partial b_h}{\partial \alpha_h} \cdot x_i \\\\
+&= \sum_{j=1}^{l} \cfrac{\partial E_k}{\partial \hat{y}_j^k} \cdot \cfrac{\partial \hat{y}_j^k}{\partial \beta_j} \cdot \cfrac{\partial \beta_j}{\partial b_h} \cdot f'(\alpha_h-\gamma_h) \cdot x_i \\\\
+&= \sum_{j=1}^{l} \cfrac{\partial E_k}{\partial \hat{y}_j^k} \cdot \cfrac{\partial \hat{y}_j^k}{\partial \beta_j} \cdot w_{hj} \cdot f'(\alpha_h-\gamma_h) \cdot x_i \\\\
+&= \sum_{j=1}^{l} (-g_j) \cdot w_{hj} \cdot f'(\alpha_h-\gamma_h) \cdot x_i \\\\
+&= -f'(\alpha_h-\gamma_h) \cdot \sum_{j=1}^{l} g_j \cdot w_{hj} \cdot x_i \\\\
+&= -b_h(1-b_h) \cdot \sum_{j=1}^{l} g_j \cdot w_{hj} \cdot x_i \\\\
+&= -e_h \cdot x_i
+\end{aligned}
+$$
+it follows that
+$$\Delta v_{ih} = -\eta \cdot (-e_h \cdot x_i)=\eta e_h x_i$$
+### 5.14
+$$\Delta \gamma_h= -\eta e_h$$
+[Derivation]: Since
+$$\Delta \gamma_h = -\eta \cfrac{\partial E_k}{\partial \gamma_h}$$
+and
+$$
+\begin{aligned}
+\cfrac{\partial E_k}{\partial \gamma_h} &= \sum_{j=1}^{l} \cfrac{\partial E_k}{\partial \hat{y}_j^k} \cdot \cfrac{\partial \hat{y}_j^k}{\partial \beta_j} \cdot \cfrac{\partial \beta_j}{\partial b_h} \cdot \cfrac{\partial b_h}{\partial \gamma_h} \\\\
+&= \sum_{j=1}^{l} \cfrac{\partial E_k}{\partial \hat{y}_j^k} \cdot \cfrac{\partial \hat{y}_j^k}{\partial \beta_j} \cdot \cfrac{\partial \beta_j}{\partial b_h} \cdot f'(\alpha_h-\gamma_h) \cdot (-1) \\\\
+&= -\sum_{j=1}^{l} \cfrac{\partial E_k}{\partial \hat{y}_j^k} \cdot \cfrac{\partial \hat{y}_j^k}{\partial \beta_j} \cdot w_{hj} \cdot f'(\alpha_h-\gamma_h)\\\\
+&=e_h
+\end{aligned}
+$$
+it follows that
+$$\Delta \gamma_h= -\eta e_h$$
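The hidden-layer derivations can be verified the same way on a tiny one-hidden-layer network, checking $\partial E_k/\partial \gamma_h = e_h$ numerically (a sketch; the layer sizes and random inputs/weights below are arbitrary assumptions):

```python
import numpy as np

rng = np.random.default_rng(0)
d, q, l = 3, 4, 2                        # input, hidden, output sizes (arbitrary)
x, y = rng.normal(size=d), rng.normal(size=l)
v = rng.normal(size=(d, q))              # input-to-hidden weights v_ih
w = rng.normal(size=(q, l))              # hidden-to-output weights w_hj
gamma, theta = rng.normal(size=q), rng.normal(size=l)   # thresholds gamma_h, theta_j

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def E(gamma_):
    b = sigmoid(x @ v - gamma_)          # hidden activations b_h
    y_hat = sigmoid(b @ w - theta)       # outputs y_hat_j
    return 0.5 * np.sum((y_hat - y) ** 2)

b = sigmoid(x @ v - gamma)
y_hat = sigmoid(b @ w - theta)
g = y_hat * (1 - y_hat) * (y - y_hat)    # g_j, as in (5.10)
e = b * (1 - b) * (w @ g)                # e_h = b_h(1-b_h) * sum_j w_hj g_j

# central differences of E_k w.r.t. each gamma_h; should equal e_h
num = np.zeros(q)
for h in range(q):
    dg = np.zeros(q); dg[h] = 1e-6
    num[h] = (E(gamma + dg) - E(gamma - dg)) / 2e-6
```

`num` matches `e` componentwise, so $\Delta \gamma_h = -\eta e_h$ follows, and the same `e` multiplied by $x_i$ gives the gradient behind (5.13).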

+ 1 - 0
Chapter7/README.md

@@ -8,4 +8,5 @@
 ### 7.2 Maximum Likelihood Estimation
 ### 7.3 Naive Bayes Classifier
 ### 7.4 Semi-Naive Bayes Classifier
+### 7.5 Bayesian Networks
 ### 7.6 EM Algorithm