$$\Delta w_i = \eta(y-\hat{y})x_i$$
[Derivation]: Here the perceptron model is:
$$y=f\left(\sum_{i} w_i x_i - \theta\right)$$
Treating $\theta$ as a dummy node, the model simplifies to:
$$y=f\left(\sum_{i} w_i x_i\right)=f(\boldsymbol w^T \boldsymbol x)$$
where $f$ is the step function.
Following §2 of 《统计学习方法》 (*Statistical Learning Methods*), let $M$ be the set of misclassified points. For a misclassified point $\boldsymbol x_i \in M$ with true label $y_i$ and predicted value $\hat{y}_i$, either $\boldsymbol w^T \boldsymbol x_i \gt 0,\hat{y}_i=1,y_i=0$ or $\boldsymbol w^T \boldsymbol x_i \lt 0,\hat{y}_i=0,y_i=1$. Combining the two cases gives:
$$(\hat{y}_i-y_i)\boldsymbol w^T \boldsymbol x_i>0$$
Hence the loss function can be taken as:
$$L(\boldsymbol w)=\sum_{\boldsymbol x_i \in M} (\hat{y}_i-y_i)\boldsymbol w^T \boldsymbol x_i$$
The gradient of the loss function is:
$$\nabla_{\boldsymbol w} L(\boldsymbol w)=\sum_{\boldsymbol x_i \in M} (\hat{y}_i-y_i)\boldsymbol x_i$$
Randomly select a misclassified point $(\boldsymbol x_i,y_i)$ and update $\boldsymbol w$:
$$\boldsymbol w \leftarrow \boldsymbol w-\eta(\hat{y}_i-y_i)\boldsymbol x_i=\boldsymbol w+\eta(y_i-\hat{y}_i)\boldsymbol x_i$$
Clearly, Equation (5.2) is exactly this update written for the $i$-th component $w_i$ of $\boldsymbol w$.
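As a sanity check, the update rule above can be sketched in Python on a toy linearly separable problem. The dataset (an AND gate), the learning rate `eta`, and the epoch count are illustrative assumptions, not from the text; the dummy-node trick folds $\theta$ into the weight vector exactly as described above.

```python
import numpy as np

def step(z):
    # Step activation: 1 if z >= 0, else 0
    return (z >= 0).astype(int)

def train_perceptron(X, y, eta=0.5, epochs=100):
    # Append a constant 1 to each sample so the threshold theta
    # becomes an ordinary weight (the "dummy node" trick above).
    Xb = np.hstack([X, np.ones((X.shape[0], 1))])
    w = np.zeros(Xb.shape[1])
    for _ in range(epochs):
        for xi, yi in zip(Xb, y):
            y_hat = step(xi @ w)
            # w <- w + eta * (y - y_hat) * x, the derived update rule;
            # it is nonzero only for misclassified points.
            w += eta * (yi - y_hat) * xi
    return w

# Toy AND gate, which is linearly separable
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]])
y = np.array([0, 0, 0, 1])
w = train_perceptron(X, y)
preds = step(np.hstack([X, np.ones((4, 1))]) @ w)
```

By the perceptron convergence theorem, the loop reaches zero misclassifications on this separable data, after which the update term vanishes and `w` stops changing.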
$$\Delta \theta_j = -\eta g_j$$

[Derivation]: Since

$$\Delta \theta_j = -\eta \cfrac{\partial E_k}{\partial \theta_j}$$

and

$$
\begin{aligned}
\cfrac{\partial E_k}{\partial \theta_j} &= \cfrac{\partial E_k}{\partial \hat{y}_j^k} \cdot \cfrac{\partial \hat{y}_j^k}{\partial \theta_j} \\
&= (\hat{y}_j^k-y_j^k) \cdot f'(\beta_j-\theta_j) \cdot (-1) \\
&= -(\hat{y}_j^k-y_j^k)f'(\beta_j-\theta_j) \\
&= g_j
\end{aligned}
$$

it follows that

$$\Delta \theta_j = -\eta \cfrac{\partial E_k}{\partial \theta_j}=-\eta g_j$$
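The identity $\partial E_k/\partial \theta_j = g_j$ can be confirmed numerically for a single sigmoid output unit. The concrete values of $\beta_j$, $\theta_j$, and $y_j^k$ below are illustrative assumptions; $\beta_j$ is held fixed so only the threshold varies.

```python
import numpy as np

sigmoid = lambda z: 1.0 / (1.0 + np.exp(-z))

# One output unit j with fixed input beta_j; illustrative values.
beta, theta, y = 0.7, 0.2, 1.0

def E(theta):
    # E_k = (1/2) * (y_hat_j - y_j)^2 for this single unit
    y_hat = sigmoid(beta - theta)
    return 0.5 * (y_hat - y) ** 2

y_hat = sigmoid(beta - theta)
# g_j = -(y_hat - y) * f'(beta - theta), with f'(z) = f(z)(1 - f(z))
g = y_hat * (1 - y_hat) * (y - y_hat)

# Central finite difference of E with respect to theta
eps = 1e-6
num = (E(theta + eps) - E(theta - eps)) / (2 * eps)
assert np.isclose(num, g, atol=1e-9)   # dE_k/dtheta_j equals g_j
```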
$$\Delta v_{ih} = \eta e_h x_i$$

[Derivation]: Since

$$\Delta v_{ih} = -\eta \cfrac{\partial E_k}{\partial v_{ih}}$$

and

$$
\begin{aligned}
\cfrac{\partial E_k}{\partial v_{ih}} &= \sum_{j=1}^{l} \cfrac{\partial E_k}{\partial \hat{y}_j^k} \cdot \cfrac{\partial \hat{y}_j^k}{\partial \beta_j} \cdot \cfrac{\partial \beta_j}{\partial b_h} \cdot \cfrac{\partial b_h}{\partial \alpha_h} \cdot \cfrac{\partial \alpha_h}{\partial v_{ih}} \\
&= \sum_{j=1}^{l} \cfrac{\partial E_k}{\partial \hat{y}_j^k} \cdot \cfrac{\partial \hat{y}_j^k}{\partial \beta_j} \cdot \cfrac{\partial \beta_j}{\partial b_h} \cdot \cfrac{\partial b_h}{\partial \alpha_h} \cdot x_i \\
&= \sum_{j=1}^{l} \cfrac{\partial E_k}{\partial \hat{y}_j^k} \cdot \cfrac{\partial \hat{y}_j^k}{\partial \beta_j} \cdot \cfrac{\partial \beta_j}{\partial b_h} \cdot f'(\alpha_h-\gamma_h) \cdot x_i \\
&= \sum_{j=1}^{l} \cfrac{\partial E_k}{\partial \hat{y}_j^k} \cdot \cfrac{\partial \hat{y}_j^k}{\partial \beta_j} \cdot w_{hj} \cdot f'(\alpha_h-\gamma_h) \cdot x_i \\
&= \sum_{j=1}^{l} (-g_j) \cdot w_{hj} \cdot f'(\alpha_h-\gamma_h) \cdot x_i \\
&= -f'(\alpha_h-\gamma_h) \cdot \sum_{j=1}^{l} g_j \cdot w_{hj} \cdot x_i \\
&= -b_h(1-b_h) \cdot \sum_{j=1}^{l} g_j \cdot w_{hj} \cdot x_i \\
&= -e_h \cdot x_i
\end{aligned}
$$

it follows that

$$\Delta v_{ih} = -\eta \cdot (-e_h) \cdot x_i=\eta e_h x_i$$
$$\Delta \gamma_h= -\eta e_h$$

[Derivation]: Since

$$\Delta \gamma_h = -\eta \cfrac{\partial E_k}{\partial \gamma_h}$$

and

$$
\begin{aligned}
\cfrac{\partial E_k}{\partial \gamma_h} &= \sum_{j=1}^{l} \cfrac{\partial E_k}{\partial \hat{y}_j^k} \cdot \cfrac{\partial \hat{y}_j^k}{\partial \beta_j} \cdot \cfrac{\partial \beta_j}{\partial b_h} \cdot \cfrac{\partial b_h}{\partial \gamma_h} \\
&= \sum_{j=1}^{l} \cfrac{\partial E_k}{\partial \hat{y}_j^k} \cdot \cfrac{\partial \hat{y}_j^k}{\partial \beta_j} \cdot \cfrac{\partial \beta_j}{\partial b_h} \cdot f'(\alpha_h-\gamma_h) \cdot (-1) \\
&= -\sum_{j=1}^{l} \cfrac{\partial E_k}{\partial \hat{y}_j^k} \cdot \cfrac{\partial \hat{y}_j^k}{\partial \beta_j} \cdot w_{hj} \cdot f'(\alpha_h-\gamma_h) \\
&= e_h
\end{aligned}
$$

it follows that

$$\Delta \gamma_h= -\eta e_h$$
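The three backpropagation formulas above ($\partial E_k/\partial \theta_j = g_j$, $\partial E_k/\partial v_{ih} = -e_h x_i$, $\partial E_k/\partial \gamma_h = e_h$) can all be verified at once with a finite-difference gradient check on a tiny one-hidden-layer sigmoid network in the same notation. The layer sizes and random parameter values are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)
sigmoid = lambda z: 1.0 / (1.0 + np.exp(-z))

# Tiny network in the book's notation: d inputs x_i, q hidden units b_h,
# l outputs y_hat_j. Sizes and values are illustrative.
d, q, l = 3, 4, 2
x = rng.normal(size=d)
y = rng.normal(size=l)
v = rng.normal(size=(d, q))       # input-to-hidden weights v_ih
gamma = rng.normal(size=q)        # hidden thresholds gamma_h
w = rng.normal(size=(q, l))       # hidden-to-output weights w_hj
theta = rng.normal(size=l)        # output thresholds theta_j

def E_k():
    b = sigmoid(x @ v - gamma)        # b_h = f(alpha_h - gamma_h)
    y_hat = sigmoid(b @ w - theta)    # y_hat_j = f(beta_j - theta_j)
    return 0.5 * np.sum((y_hat - y) ** 2)

# Analytic g_j and e_h, exactly as derived above
b = sigmoid(x @ v - gamma)
y_hat = sigmoid(b @ w - theta)
g = y_hat * (1 - y_hat) * (y - y_hat)     # g_j
e = b * (1 - b) * (w @ g)                 # e_h

def numeric_grad(param, eps=1e-6):
    # Central finite differences of E_k w.r.t. each entry of param.
    grad = np.zeros_like(param)
    for idx in np.ndindex(param.shape):
        old = param[idx]
        param[idx] = old + eps
        e_plus = E_k()
        param[idx] = old - eps
        e_minus = E_k()
        param[idx] = old
        grad[idx] = (e_plus - e_minus) / (2 * eps)
    return grad

assert np.allclose(numeric_grad(theta), g, atol=1e-7)            # dE_k/dtheta_j = g_j
assert np.allclose(numeric_grad(gamma), e, atol=1e-7)            # dE_k/dgamma_h = e_h
assert np.allclose(numeric_grad(v), -np.outer(x, e), atol=1e-7)  # dE_k/dv_ih = -e_h * x_i
```

If any of the three assertions failed, a sign or index in the corresponding derivation would be wrong; agreement to within the finite-difference tolerance confirms the chain-rule factorizations term by term.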