The convergence rate of linearly separable SMO.

IJCNN (2013)

Abstract
It is well known that the dual function value sequence generated by SMO converges linearly when the kernel matrix is positive definite, and that sublinear convergence holds for a general kernel matrix. In this paper we prove that, when SMO is applied to hard-margin, i.e., linearly separable, SVM problems, a linear convergence rate holds without any condition on the kernel matrix. Moreover, we show linear convergence for the multiplier sequence generated by SMO, for the corresponding weight vectors, and for the KKT gap usually used to control the number of SMO iterations. This gives a fairly complete picture of the convergence of the various sequences SMO generates. While linear SMO convergence for the general L1 soft-margin SVM problem remains open, the approach followed here may lead to such a general result.
Keywords
convergence,iterative methods,matrix algebra,optimisation,KKT gap,SMO algorithm,SMO iteration,SVM soft margin problem,definite convergence,dual function value sequence,general matrix,kernel matrix,linear SMO convergence,linear convergence rate,linearly separable SMO,linearly separable SVM problems,multiplier sequence,sublinear convergence,weight vectors
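The paper itself gives no code; the following is a minimal NumPy sketch of maximal-violating-pair SMO for the hard-margin dual min_a 1/2 a'Qa - e'a subject to y'a = 0, a >= 0, using the KKT gap m(alpha) - M(alpha) as the stopping criterion mentioned in the abstract. The linear kernel, the tolerance value, and the function name smo_hard_margin are illustrative assumptions, not details taken from the paper.

```python
# Sketch only: maximal-violating-pair SMO for the hard-margin SVM dual,
# stopped by the KKT gap. Linear kernel and parameter values are assumptions.
import numpy as np

def smo_hard_margin(X, y, tol=1e-3, max_iter=10_000, tau=1e-12):
    n = len(y)
    K = X @ X.T                           # linear kernel (illustrative choice)
    Q = (y[:, None] * y[None, :]) * K
    alpha = np.zeros(n)
    grad = -np.ones(n)                    # gradient of the dual at alpha = 0

    for _ in range(max_iter):
        # Index sets for the hard-margin case (no upper bound on alpha).
        up = (y == 1) | ((y == -1) & (alpha > 0))
        low = (y == -1) | ((y == 1) & (alpha > 0))
        viol = -y * grad
        i = np.where(up)[0][np.argmax(viol[up])]
        j = np.where(low)[0][np.argmin(viol[low])]

        gap = viol[i] - viol[j]           # KKT gap m(alpha) - M(alpha)
        if gap <= tol:
            break

        # Analytic two-variable step along the equality constraint y'a = 0.
        a = max(Q[i, i] + Q[j, j] - 2.0 * y[i] * y[j] * Q[i, j], tau)
        t = gap / a
        if y[i] == -1:
            t = min(t, alpha[i])          # keep alpha[i] >= 0
        if y[j] == 1:
            t = min(t, alpha[j])          # keep alpha[j] >= 0

        d_i, d_j = y[i] * t, -y[j] * t
        alpha[i] += d_i
        alpha[j] += d_j
        grad += Q[:, i] * d_i + Q[:, j] * d_j

    w = (alpha * y) @ X                   # primal weight vector (linear kernel)
    sv = alpha > 1e-8
    b = np.mean(y[sv] - X[sv] @ w) if sv.any() else 0.0
    return alpha, w, b, gap
```

On a linearly separable data set the loop terminates once the KKT gap drops below tol; the paper's contribution concerns the (linear) rate at which this gap, the multipliers, and the weight vectors converge, without any positive-definiteness assumption on the kernel matrix.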