In order to solve (11), we transform it to its dual problem and introduce the Lagrangian:

L = Σ_k α_k K(x_k, x_k) − Σ_{k,l} α_k α_l K(x_k, x_l)    (12)

The experiment parameters are listed in table 1. For notational simplicity, in figure 2 our algorithm is abbreviated as Our ISVM. In addition to conducting experiments with our algorithm, we also implemented and tested another popular and effective incremental learning algorithm, ISVM [8][9], on the same datasets so that we could compare their learning performance. In our experiment we choose the RBF kernel K(x, y) = exp(−||x − y||²/(2σ²)).
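As a numerical illustration of the dual objective (12) under the RBF kernel choice above, the following sketch evaluates it for an arbitrary feasible α (the data, σ, and the uniform α are illustrative assumptions, not values from the paper):

```python
import numpy as np

def rbf_kernel(X, Y, sigma=1.0):
    """Gaussian RBF kernel matrix: K(x, y) = exp(-||x - y||^2 / (2 sigma^2))."""
    sq = np.sum(X**2, axis=1)[:, None] + np.sum(Y**2, axis=1)[None, :] - 2 * X @ Y.T
    return np.exp(-sq / (2 * sigma**2))

def svdd_dual_objective(alpha, K):
    """L = sum_k alpha_k K(x_k, x_k) - sum_{k,l} alpha_k alpha_l K(x_k, x_l)."""
    return alpha @ np.diag(K) - alpha @ K @ alpha

rng = np.random.default_rng(0)
X = rng.normal(size=(20, 2))       # illustrative data, not a paper dataset
K = rbf_kernel(X, X, sigma=1.0)
alpha = np.full(20, 1.0 / 20)      # feasible: entries sum to 1, within [0, C]
L = svdd_dual_objective(alpha, K)
```

Note that with the RBF kernel K(x, x) = 1, so the first term of (12) reduces to Σ_k α_k = 1 and the maximization effectively minimizes αᵀKα.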
[Fig. 2 panel: Heart]

[3] T. Joachims: Text categorization with support vector machines: learning with many relevant features, Proceedings of the European Conference on Machine Learning, Springer, Berlin, 1998, pp. 137-142.
[6] L. Baoqing: Distance-based selection of potential support vector by kernel matrix. In International Symposium on Neural Networks 2004, LNCS 3173, pp. 468-473, 2004.
Fig.
2.
Performance
of
two
incremental
learning
algorithms
[7] D. Tax: One-class classification. PhD thesis, Delft University of Technology, http://www.ph.tn.tudelft.nl/~davidt/thesis.pdf, 2001.

From figure 2 we can see that after each step of incremental training, the prediction accuracy on the test set does not vary much, which satisfies the requirement of algorithm
stability; we can also see that the algorithm's performance improved gradually, and that the algorithm owns the ability of performance recoverability. So the incremental algorithm proposed in this paper meets the demands of incremental learning.

[8] N.A. Syed, H. Liu, K. Sung: From incremental learning to model independent instance selection - a support vector machine approach, Technical Report TRA9/99, NUS, 1999.

[9] L. Yangguang, C. Qi, T. Yongchuan et al.: Incremental updating method for support vector machine, APWeb 2004, LNCS 3007, pp. 426-435, 2004.
The experiment results show that our algorithm has learning performance similar to that of the popular ISVM algorithm presented in [9]. Another observation from our experiment is that, as our incremental learning algorithm proceeds, the improvement in learning performance becomes smaller and smaller, until at last the learning performance no longer improves. This indicates that we can use this characteristic to estimate the number of samples needed for the problem description.

[10] S.R. Gunn: Support vector machines for classification and regression. Technical Report, Image Speech and Intelligent Systems Research Group, University of Southampton, 1997.
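The plateau behaviour described above suggests a simple rule for estimating the needed sample count; a minimal sketch, assuming test accuracy is recorded after each increment (the tolerance, batch size, and accuracy values are hypothetical):

```python
def samples_needed(accuracies, batch_size, tol=0.005, patience=2):
    """Estimate how many samples suffice: stop once test accuracy has
    improved by less than `tol` for `patience` consecutive increments."""
    flat = 0
    for step in range(1, len(accuracies)):
        if accuracies[step] - accuracies[step - 1] < tol:
            flat += 1
            if flat >= patience:
                return (step + 1) * batch_size  # samples seen up to this step
        else:
            flat = 0
    return len(accuracies) * batch_size         # curve never plateaued

# Hypothetical accuracy curve: improvement shrinks, then stops.
acc = [0.70, 0.80, 0.85, 0.87, 0.872, 0.873, 0.873]
n = samples_needed(acc, batch_size=100)  # plateau detected after 6 batches
```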
5. Conclusion

In this paper we proposed an incremental learning algorithm based on the support vector domain classifier (SVDC). Its key idea is to obtain the initial concept using standard SVDC and then to update it using the technique presented in this paper, which in fact amounts to solving a QP problem similar to the one solved in the standard SVDC algorithm. Experiments show that our algorithm is effective and promising.
Other characteristics of this algorithm include: the updating model has a mathematical form similar to that of standard SVDC, and we can acquire a sparse expression of its solutions; meanwhile, using this algorithm we can return to the last step without extra computation; furthermore, this algorithm can be used to estimate the number of samples required for the problem description.
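The ability to return to the last step without extra computation can be realized by snapshotting the previous solution before each update; a toy sketch (the class and field names are ours, not the paper's):

```python
class IncrementalModel:
    """Toy stand-in for the SVDC state: keeps the previous step's
    multipliers so one step can be undone without retraining."""

    def __init__(self):
        self.alpha = []         # current Lagrange multipliers
        self.prev_alpha = None  # snapshot of the last step, if any

    def update(self, new_alpha):
        self.prev_alpha = list(self.alpha)  # O(n) copy, no recomputation
        self.alpha = list(new_alpha)

    def rollback(self):
        """Return to the state before the most recent update."""
        if self.prev_alpha is None:
            raise RuntimeError("no previous step to return to")
        self.alpha = self.prev_alpha
        self.prev_alpha = None

m = IncrementalModel()
m.update([0.5, 0.5])                 # initial concept
m.update([0.25, 0.25, 0.25, 0.25])   # incremental step
m.rollback()                         # undo the incremental step
```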
REFERENCES

[1] C. Cortes, V.N. Vapnik: Support vector networks, Mach. Learn. 20 (1995) pp. 273-297.

[2] V.N. Vapnik: Statistical Learning Theory, Wiley, New York, 1998.
2. Support Vector Domain Classifier

with constraints Σ_i α_i = 1 and 0 ≤ α_i ≤ C, where the inner product has been replaced with a kernel function K(·,·), and K(·,·) is a definite kernel satisfying the Mercer condition; for example, a popular choice is the Gaussian kernel K(x, z) = exp(−||x − z||²/(2σ²)), σ > 0.

2.1 Support Vector Domain Description [7]

Of a data set containing N data objects {x_i, i = 1, ..., N},
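Under the standard SVDD formulation of [7], membership of a new point z can be tested by its feature-space distance to the sphere centre, K(z,z) − 2 Σ_i α_i K(z,x_i) + Σ_{i,j} α_i α_j K(x_i,x_j); a small sketch with illustrative data and a uniform α (both our assumptions, not taken from the paper):

```python
import numpy as np

def rbf(x, z, sigma=1.0):
    """Gaussian kernel K(x, z) = exp(-||x - z||^2 / (2 sigma^2))."""
    return np.exp(-np.sum((x - z) ** 2) / (2 * sigma ** 2))

def dist2_to_center(z, X, alpha, sigma=1.0):
    """Squared feature-space distance from z to the SVDD sphere centre."""
    kzx = np.array([rbf(z, x, sigma) for x in X])
    Kxx = np.array([[rbf(a, b, sigma) for b in X] for a in X])
    return rbf(z, z, sigma) - 2 * alpha @ kzx + alpha @ Kxx @ alpha

X = np.array([[0.0, 0.0], [1.0, 0.0], [0.0, 1.0]])  # illustrative objects
alpha = np.full(3, 1.0 / 3)                          # uniform, sums to 1
inside = dist2_to_center(np.array([0.3, 0.3]), X, alpha)
outside = dist2_to_center(np.array([5.0, 5.0]), X, alpha)
```

A point far from the data yields a larger distance than one near it; comparing this distance with the sphere radius (computed at a support vector) gives the accept/reject decision.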
Literature [8] points out that an efficient incremental learning algorithm should satisfy the following three criteria:
A. Stability: when each step of incremental learning is over, the prediction accuracy on the test set should not vary too obviously;
B. Improvement: as the incremental learning proceeds, the algorithm's prediction accuracy should improve gradually;
C. Recoverability: the incremental learning algorithm should own the ability of performance recoverability; that is to say, when the learning performance of the algorithm descends after a certain learning step, the algorithm can recover or even surpass the former performance in the later learning procedure.
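The three criteria above can be checked mechanically on a recorded accuracy-per-step curve; a sketch with assumed thresholds (the tolerance and the curve are illustrative, not taken from [8]):

```python
def check_criteria(acc, max_drop=0.05):
    """Evaluate an accuracy-per-step curve against criteria A-C."""
    steps = range(1, len(acc))
    # A. Stability: no single step changes accuracy too sharply.
    stable = all(abs(acc[i] - acc[i - 1]) <= max_drop for i in steps)
    # B. Improvement: the curve ends higher than it starts.
    improves = acc[-1] > acc[0]
    # C. Recoverability: every dip is later matched or surpassed.
    recovers = all(max(acc[i:]) >= acc[i - 1] for i in steps)
    return stable, improves, recovers

acc = [0.80, 0.84, 0.83, 0.86, 0.88]  # dips at step 2, then recovers
result = check_criteria(acc)
```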
Figure 2 shows the experiment results of the two different incremental learning algorithms.

[Fig. 2 panels: Heart, (c) Flare-Solar, (d) German, Banana; legend: Our ISVM vs. ISVM; x-axis: Incremental Learning Step]