...

2. Support Vector Domain Classifier

2.1 Support Vector Domain Description [7]

Of a data set containing N data objects, ...

... with constraints $\sum_i \alpha_i y_i = 1$ and $0 < \alpha_i < C$, where the inner product has been replaced with the kernel function $K(\cdot,\cdot)$, and $K(\cdot,\cdot)$ is a positive definite kernel satisfying Mercer's condition; a popular choice, for example, is the Gaussian kernel

$K(x, z) = \exp(-\|x - z\|^2 / (2\sigma^2)), \quad \sigma > 0$

and the center of the sphere is

$a = \sum_{k} \alpha_k y_k x_k \qquad (10)$
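As an aside on the kernel choice above, the following is a minimal numpy sketch (all function names here are ours, not from the paper) of the Gaussian kernel, together with a numerical check that a sample kernel matrix is symmetric positive semidefinite, which is what Mercer's condition amounts to on finite data:

```python
import numpy as np

def gaussian_kernel(x, z, sigma=1.0):
    """Gaussian kernel K(x, z) = exp(-||x - z||^2 / (2 sigma^2)), sigma > 0."""
    diff = np.asarray(x, dtype=float) - np.asarray(z, dtype=float)
    return np.exp(-diff.dot(diff) / (2.0 * sigma ** 2))

def kernel_matrix(X, sigma=1.0):
    """Pairwise kernel matrix K_ij = K(x_i, x_j) for the rows of X."""
    X = np.asarray(X, dtype=float)
    sq = np.sum(X ** 2, axis=1)
    d2 = sq[:, None] + sq[None, :] - 2.0 * X @ X.T  # squared distances
    return np.exp(-d2 / (2.0 * sigma ** 2))

# Mercer's condition on a finite sample: the kernel matrix is
# symmetric positive semidefinite (all eigenvalues >= 0).
rng = np.random.default_rng(0)
X = rng.normal(size=(20, 3))
K = kernel_matrix(X, sigma=1.5)
eigvals = np.linalg.eigvalsh(K)
print(eigvals.min() >= -1e-10)  # True: PSD up to round-off
```

The vectorized `kernel_matrix` avoids an explicit double loop; only the eigenvalue check is O(n^3).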
In formula (10), $x_k$ represents a support vector, and $k$ is the number of support vectors. Finally, we obtain the following decision function:

$f_k(x) = \mathrm{sgn}\{R_k^2 - K(x, x) + 2 \sum_{x_i \in SV} \alpha_i y_i K(x, x_i) - \sum_{x_i \in SV} \sum_{x_j \in SV} \alpha_i \alpha_j y_i y_j K(x_i, x_j)\} \qquad (13)$

If $f_k(x) > 0$, the tested sample is contained in the sphere, and we regard the samples enclosed in the sphere as same-class objects; otherwise it is rejected, and we regard it as an opposite-class object.
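A hedged numpy sketch of evaluating the decision function (13): the function and variable names are ours, and the support vectors, coefficients $\alpha_i$, labels $y_i$ and radius $R$ are assumed to come from an already trained SVDC.

```python
import numpy as np

def gaussian_kernel(x, z, sigma=1.0):
    # K(x, z) = exp(-||x - z||^2 / (2 sigma^2))
    d = np.asarray(x, float) - np.asarray(z, float)
    return np.exp(-d.dot(d) / (2.0 * sigma ** 2))

def svdc_decide(x, sv, alpha, y, R, kernel=gaussian_kernel):
    """Sign of R^2 - ||x - a||^2 with a = sum_i alpha_i y_i x_i, computed
    through kernels as in formula (13):
    sgn{R^2 - K(x,x) + 2 sum_i a_i y_i K(x,x_i)
        - sum_ij a_i a_j y_i y_j K(x_i,x_j)}."""
    sv = np.asarray(sv, float)
    ay = np.asarray(alpha, float) * np.asarray(y, float)
    kxx = kernel(x, x)
    kxsv = np.array([kernel(x, s) for s in sv])
    ksvsv = np.array([[kernel(si, sj) for sj in sv] for si in sv])
    dist2 = kxx - 2.0 * ay @ kxsv + ay @ ksvsv @ ay
    return 1 if R ** 2 - dist2 > 0 else -1

# Toy usage: a single support vector at the origin, radius 0.5.
print(svdc_decide([0.1, 0.0], sv=[[0.0, 0.0]], alpha=[1.0], y=[1.0], R=0.5))  # inside -> 1
```

Note that the distance is measured in the kernel-induced feature space, so with a Gaussian kernel it differs from the Euclidean distance in input space.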
3. SVDC Incremental Learning Algorithm

According to formula (6), we suppose the initial parameter (sphere radius) obtained by learning with the initial training set is $R_0$, and the set of support vectors is $SV_0$. The parameter becomes $R_k$ in the $k$th incremental learning step, the set of support vectors becomes $SV_k$, and the new dataset in the $k$th step becomes $D_k = \{(x_j, y_j)\}_{j=1}^{l_k}$. Assume we have obtained $R_{k-1}$; we update the current model using $SV_{k-1}$ and the new dataset by solving the following quadratic programming (QP) problem:

$\min_{R_k} g(R_k) = \|R_k - R_{k-1}\|^2$
$\text{s.t. } R_k^2 - (x_i - a)^T (x_i - a) \ge 0, \quad x_i \in D_k \qquad (11)$

where $R_{k-1}$ is the radius from the last optimization problem (11); when $k = 1$, $R_0$ is the radius of the standard SVDC. Obviously, when $R_{k-1} = 0$, the incremental SVDC has the same form as the standard SVDC. We will find that the updated model obtained by the incremental SVDC also owns the special property of solution sparsity possessed by the standard SVDC, and its decision function becomes

$f_k(x) = \mathrm{sgn}\{R_{k-1}^2 + 2 R_{k-1} \sum_{x_i \in SV_k} \alpha_i y_i x_i + (\sum_{x_i \in SV_k} \alpha_i y_i x_i)^2 - K(x, x) + 2 \sum_{x_i \in SV_k} \alpha_i y_i K(x, x_i) - \sum_{x_i \in SV_k} \sum_{x_j \in SV_k} \alpha_i \alpha_j y_i y_j K(x_i, x_j)\} \qquad (14)$

From equation (14) we can see that it is easy to return to the last step of incremental learning without extra computation. From the above analysis we can see that with only a trifling modification, the standard SVDC can be used to solve the updated model in the incremental learning procedure. Now we summarize our algorithm as follows:

Step 1 Learning the initial concept: train the SVDC using the initial dataset $TS_0$; the parameter $R_0$ is then obtained;
Step 2 Updating the current concept: when new data are available, use them to solve the QP problem of formula (11) and obtain the new concept;
Step 3 Repeat Step 2 until the incremental learning is over.

4. Experiments and Results

In order to evaluate the learning performance offered by our incremental algorithm, we conducted experiments on six different datasets taken from the UCI Machine Learning Repository: Banana, Diabetes, Flare-Solar, Heart, Breast-Cancer, and German. Note that some of them are not binary-class classification problems, but we have transformed them into binary-class problems. The experiment parameters and datasets are shown in ...
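The three steps of the incremental algorithm in Section 3 can be sketched as follows. This is our illustrative simplification, not the paper's implementation: we keep the sphere center fixed and drop slack variables, in which case the QP of formula (11), $\min (R_k - R_{k-1})^2$ subject to $R_k^2 \ge \|x_i - a\|^2$ for $x_i \in D_k$, has the closed-form minimizer $R_k = \max(R_{k-1}, \max_i \|x_i - a\|)$.

```python
import numpy as np

def fit_radius(X, a):
    """Radius of the smallest sphere around a fixed center a that
    encloses all points of X (Step 1, simplified)."""
    X = np.asarray(X, float)
    return float(np.max(np.linalg.norm(X - a, axis=1)))

def incremental_update(R_prev, D_new, a):
    """One pass of the QP in (11) with the center a fixed and no slack:
    min (R_k - R_{k-1})^2  s.t.  R_k^2 >= ||x_i - a||^2, x_i in D_k,
    whose minimizer is R_k = max(R_{k-1}, max_i ||x_i - a||)."""
    return max(R_prev, fit_radius(D_new, a))

# Step 1: initial concept from TS_0 (center fixed at the mean, for illustration only).
rng = np.random.default_rng(0)
TS0 = rng.normal(size=(50, 2))
a = TS0.mean(axis=0)
R = fit_radius(TS0, a)

# Steps 2-3: update the concept as new batches D_k arrive.
for _ in range(3):
    Dk = rng.normal(size=(20, 2))
    R = incremental_update(R, Dk, a)  # the radius can only grow or stay
print(R > 0)  # True
```

Because each update only enlarges the radius when a new point falls outside, storing the previous radius is all that is needed to "return to the last step without extra computation", as noted for equation (14).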
Fig. 2. Performance of two incremental learning algorithms (x-axis: incremental learning step)

From figure 2 we can see that after each step of incremental training, the variation of the prediction accuracy on the test set is small, which satisfies the requirement of algorithm stability; we can also discover that the performance is gradually improved and that the algorithm owns the ability of performance recoverability. So our incremental algorithm meets the demands of incremental learning.

The experiment results show that our algorithm has learning performance similar to the popular ISVM algorithm presented in [9]. Another discovery in our experiments is that, as our incremental learning algorithm gradually proceeds, the improvement in learning performance becomes less and less, and at last the learning performance no longer improves. This indicates that we can use this characteristic to estimate the number of samples required for the problem description.

5. Conclusion

In this paper we proposed an incremental learning algorithm based on the support vector domain classifier (SVDC). Its key idea is to obtain the initial concept using the standard SVDC and then update it with the technique presented in this paper, which in fact amounts to solving a QP problem similar to the one solved by the standard SVDC algorithm. Experiments show that our algorithm is effective and promising. Other characteristics of this algorithm include: the updated model has a mathematical form similar to that of the standard SVDC, and we can acquire a sparse expression of its solutions; meanwhile, using this algorithm we can return to the last step without extra computation; furthermore, this algorithm can be used to estimate the number of samples required for the problem description.

REFERENCES
[1] C. Cortes, V. N. Vapnik: Support vector networks, Mach. Learn. 20 (1995) pp. 273-297.
[2] V. N. Vapnik: Statistical Learning Theory, Wiley, New York, 1998.
[6] L. Baoqing: Distance-based selection of potential support vector by kernel matrix. In International Symposium on Neural Networks 2004, LNCS 3173, pp. 468-473, 2004.
[7] D. Tax: One-class classification. PhD thesis, Delft University of Technology, http://www.phtn.tudelft.nl/~davidt/thesis.pdf (2001).
[8] N. A. Syed, H. Liu, K. Sung: From incremental learning to model independent instance selection - a support vector machine approach, Technical Report, TRA9/99, NUS, 1999.
[9] L. Yangguang, C. Qi, T. Yongchuan et al.: Incremental updating method for support vector machine, APWeb 2004, LNCS 3007, pp. 426-435, 2004.
[10] S. R. Gunn: Support vector machines for classification and regression. Technical Report, Image Speech and Intelligent Systems Research Group, University of Southampton, 1997.