Re: [Question] Implementing a linear SVM

Author: empireisme (empireisme)   2020-03-31 00:36:17
※ Quoting empireisme (empireisme):
: https://imgur.com/uImTpvF
: Hi everyone,
: I've recently been implementing a linear SVM,
: using the gradient descent method taught in Prof. Hung-yi Lee's NTU course to find the weights.
: The details are here:
: https://imgur.com/uImTpvF
: But my result looks wrong:
: https://imgur.com/j2uME6R
: As the plot shows, the margin lines should pass through points A and B, and I can't see where the mistake is QQ
: Hoping someone can spot it.
: Here is my code:
: https://ideone.com/qg0zVC
: I wondered whether the eta term in W[i+1,j] <- W[i,j] + eta*sum(((y*(X%*%W[i,]))<1)*1 * y * X[,j])
: should have a minus sign instead, but switching to minus made things even worse.
: I've checked for quite a while now and would appreciate any help.
Found the bug.
The key is that the initial guess must be all zeros:
w_initial <- rep(0, p)
and in addition these two abline calls need parentheses around the numerator:
abline((1-w_answer[1])/w_answer[3], -w_answer[2]/w_answer[3], lty=2)
abline((-1-w_answer[1])/w_answer[3], -w_answer[2]/w_answer[3], lty=2)
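For anyone who copies this: the parentheses matter because of operator precedence. Without them, R parses 1 - w[1]/w[3] as 1 - (w[1]/w[3]), not (1 - w[1])/w[3], so the margin lines get the wrong intercepts. A minimal sketch with made-up weights:

```r
# Hypothetical weights c(w0, w1, w2), chosen only to show the precedence trap
w <- c(0.5, 2, 4)

wrong <- 1 - w[1] / w[3]    # parsed as 1 - (w[1]/w[3]) = 0.875
right <- (1 - w[1]) / w[3]  # the intended margin intercept = 0.125
```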
svm_gradient <- function(x, eta = 0.001, R = 10000){
X <- cbind(1, x)        # design matrix with bias column
n <- nrow(X)            # number of samples
p <- ncol(X)            # number of features + 1 (bias)
w_initial <- rep(0, p)  # initial guess: all zeros
W <- matrix(w_initial, nrow = R + 1, ncol = p, byrow = T)  # one row per iterate
# gradient descent on the hinge loss; note y is taken from the global environment
for(i in 1:R){
for(j in 1:p)
{
W[i+1, j] <- W[i, j] + eta * sum(((y * (X %*% W[i, ])) < 1) * 1 * y * X[, j])
}
}
return(W)
}
getsvm <- function(x){
W <- svm_gradient(x)      # run gradient descent once
w_answer <- W[nrow(W), ]  # the last iterate is the answer
return(w_answer)
}
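A side note on the double loop: the inner loop over j can be replaced by one vectorized update per iteration, which is much faster in R. A sketch of the same hinge-loss subgradient step (the function name svm_step is mine; X, y, eta have the same meaning as above):

```r
# One subgradient step on the hinge loss sum(pmax(0, 1 - y * (X %*% w))).
# Only samples that violate the margin (y_i * w.x_i < 1) contribute,
# which is exactly what the ((y*(X%*%W[i,]))<1) indicator does above.
svm_step <- function(w, X, y, eta = 0.001) {
  active <- as.vector(y * (X %*% w)) < 1          # margin violators
  w + eta * colSums(active * as.vector(y) * X)    # same + sign as the loop version
}
```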
set.seed(3)
n = 5
a1 = rnorm(n)                 # class +1: points above the line x2 = 1 - x1
a2 = 1 - a1 + 2*runif(n)
b1 = rnorm(n)                 # class -1: points below the line x2 = -1 - x1
b2 = -1 - b1 - 2*runif(n)
x = rbind(matrix(cbind(a1, a2), , 2), matrix(cbind(b1, b2), , 2))
y <- matrix(c(rep(1, n), rep(-1, n)))
plot(x, col = ifelse(y > 0, 4, 2), pch = ".", cex = 3, xlab = "x1", ylab = "x2")
w_answer <- getsvm(x)
abline(-w_answer[1]/w_answer[3], -w_answer[2]/w_answer[3])               # decision boundary
abline((1-w_answer[1])/w_answer[3], -w_answer[2]/w_answer[3], lty = 2)   # upper margin
abline((-1-w_answer[1])/w_answer[3], -w_answer[2]/w_answer[3], lty = 2)  # lower margin
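One way to sanity-check a solution like this is the functional margin y_i * (w . x_i): at the optimum, the support vectors sit at margin ≈ 1 and every other point further out. A self-contained sketch on a tiny hand-made separable set (the data, eta, and iteration count are made up for illustration; the update is the same subgradient step as above):

```r
# Tiny separable toy set: two points per class, bias column included
X <- cbind(1, rbind(c(1, 1), c(2, 2), c(-1, -1), c(-2, -2)))
y <- c(1, 1, -1, -1)

w <- rep(0, ncol(X))                     # zero initial guess, as in the fix above
for (i in 1:20000) {                     # plain subgradient descent on the hinge loss
  active <- as.vector(y * (X %*% w)) < 1
  w <- w + 0.001 * colSums(active * y * X)
}

margins <- as.vector(y * (X %*% w))      # roughly c(1, 2, 1, 2):
round(margins, 2)                        # the inner points end up on the margin
```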
