SMO (Sequential Minimal Optimization): a detailed implementation walkthrough
The SMO algorithm was devised to solve the SVM (support vector machine) training problem efficiently. The SVM derivation generally reduces to the following dual problem:

$$\max_{\alpha}\ \sum_{i=1}^{m}\alpha_{i}-\frac{1}{2}\sum_{i=1}^{m}\sum_{j=1}^{m}\alpha_{i}\alpha_{j}y_{i}y_{j}x_{i}^{T}x_{j}$$

$$\text{s.t.}\quad \sum_{i=1}^{m}\alpha_{i}y_{i}=0,\qquad 0\le\alpha_{i}\le C,\ i=1,2,3,\dots,m$$

where $C$ is the SVM penalty parameter (regularization constant). Write the objective as

$$\varphi(\alpha)=\sum_{i=1}^{m}\alpha_{i}-\frac{1}{2}\sum_{i=1}^{m}\sum_{j=1}^{m}\alpha_{i}\alpha_{j}y_{i}y_{j}x_{i}^{T}x_{j}$$

The concrete steps of SMO are as follows.

Step 1: To keep the constraint $\sum_{i=1}^{m}\alpha_{i}y_{i}=0$ satisfied, select two variables $\alpha_{i}$ and $\alpha_{j}$ to optimize while holding all the others fixed. Taking $\alpha_{1}$ and $\alpha_{2}$ as the pair, with $\alpha_{i}\ (i=3,4,\dots,m)$ treated as known quantities, the constraint becomes

$$\alpha_{1}y_{1}+\alpha_{2}y_{2}=c=-\sum_{i=3}^{m}\alpha_{i}y_{i},\qquad(0\le\alpha_{1}\le C,\ 0\le\alpha_{2}\le C)$$

Multiplying both sides by $y_{1}$ (noting $y_{1}^{2}=1$) and writing $y_{1}y_{2}=h_{0}$ gives

$$\alpha_{1}+h_{0}\alpha_{2}=-y_{1}\sum_{i=3}^{m}\alpha_{i}y_{i}=\alpha_{1}^{new}+h_{0}\alpha_{2}^{new}$$

Let $H=-y_{1}\sum_{i=3}^{m}\alpha_{i}y_{i}$.
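The equality constraint in Step 1 confines the pair $(\alpha_{1},\alpha_{2})$ to a line segment inside the box $[0,C]^2$, so an updated $\alpha_{2}$ must be clipped to the feasible interval that the segment induces. The sketch below (function names are my own, not from the text) computes those box bounds, splitting on the sign of $h_{0}=y_{1}y_{2}$:

```python
def feasible_bounds(alpha1, alpha2, y1, y2, C):
    """Lower/upper bounds (L, H) for the new alpha2 implied by
    alpha1*y1 + alpha2*y2 = const together with 0 <= alpha_i <= C."""
    if y1 != y2:
        # h0 = y1*y2 = -1, so alpha1 - alpha2 stays constant
        L = max(0.0, alpha2 - alpha1)
        H = min(C, C + alpha2 - alpha1)
    else:
        # h0 = +1, so alpha1 + alpha2 stays constant
        L = max(0.0, alpha1 + alpha2 - C)
        H = min(C, alpha1 + alpha2)
    return L, H

def clip(alpha2_new, L, H):
    """Project an unconstrained alpha2 update back onto [L, H]."""
    return min(max(alpha2_new, L), H)
```

For example, with $\alpha_{1}=0.2$, $\alpha_{2}=0.5$, opposite labels, and $C=1$, the feasible interval for the new $\alpha_{2}$ is $[0.3, 1.0]$, and an unconstrained update of $1.2$ would be clipped to $1.0$.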