XCSF with Local Deletion: Preventing Detrimental Forgetting
1. XCSF with Local Deletion: Preventing Detrimental Forgetting

Olivier Sigaud
Institut des Systèmes Intelligents et de Robotique, Université Pierre et Marie Curie Paris 6, CNRS UMR 7222, 4 place Jussieu, F-75005 Paris, France
olivier.sigaud@upmc.fr

Martin V. Butz
Department of Psychology III, University of Würzburg, Röntgenring 11, 97070 Würzburg, Germany
butz@psychologie.uni-wuerzburg.de
2. Motivation

Achieve the following goals:
- Maintain a complete solution
- Avoid detrimental forgetting
- Enable continuous learning with selective focus

… particularly in problems where:
- the problem space is non-uniformly or non-independently sampled (not i.i.d.),
- the sub-space is not fully sampled (learning in manifolds),
- some problem subspaces need to be known better (smaller error) than others (selective learning).
3. Observation

XCSF reproduces locally but deletes globally. This is good, because it generates a generalization pressure (local classifiers are, on average, more general). It is bad, however, because non-uniformly sampled problems can lead to forgetting. Thus: how can we delete locally and still generate the generalization pressure?
4. Approach: Choose local candidates for deletion without dependency on their generality.

Algorithm:
  Select random classifier cl from [M]
  [D] = ∅
  for all c ∈ [P] do
    if cl matches the center of c then
      add c to candidate list [D]
    end if
  end for
  Delete from candidate list [D]
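The candidate-selection step above can be sketched in Python. This is an illustrative sketch, not the authors' implementation: the `Classifier` class with a radial (center, radius) condition, the `matches` test, and all names are assumptions chosen to make the [M]/[P]/[D] logic concrete.

```python
import random

class Classifier:
    """Illustrative classifier with a radial (center, radius) condition."""
    def __init__(self, center, radius):
        self.center = center
        self.radius = radius

    def matches(self, point):
        # A point is matched if it lies inside the receptive field.
        return sum((c - p) ** 2 for c, p in zip(self.center, point)) <= self.radius ** 2

def local_deletion_candidates(match_set, population):
    """Build the deletion-candidate list [D] local to a random classifier from [M].

    [D] collects every classifier c in [P] whose *center* is matched by cl,
    so deletion is restricted to cl's local neighborhood and does not
    depend on the generality of the candidates themselves.
    """
    cl = random.choice(match_set)            # random classifier cl from [M]
    candidates = [c for c in population      # [D] = all c in [P] ...
                  if cl.matches(c.center)]   # ... whose center cl matches
    return candidates                        # delete from this list [D]
```

Note that the usual (e.g. fitness- and experience-based) deletion scheme can then be applied within `[D]` instead of within the whole population, which is what keeps the generalization pressure intact while localizing deletion.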
5. The Two Evaluation Functions

- Crossed-Ridge Function
- Diagonal Sine Function
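The slide names the two benchmark functions without giving their formulas, so the definitions below are assumptions for illustration only: a commonly used crossed-ridge form (two Gaussian ridges crossing at the origin plus a central bump) and a generic sine that varies along the diagonal x + y, with frequencies and coefficients chosen arbitrarily.

```python
import math

def crossed_ridge(x, y):
    """Assumed crossed-ridge form: the maximum of a ridge along x = 0,
    a ridge along y = 0, and a central Gaussian bump."""
    return max(math.exp(-10.0 * x * x),
               math.exp(-50.0 * y * y),
               1.25 * math.exp(-5.0 * (x * x + y * y)))

def diagonal_sine(x, y):
    """Assumed diagonal sine: varies only along the direction x + y
    (frequency chosen for illustration)."""
    return math.sin(2.0 * math.pi * (x + y))
```

Both shapes make the benchmark's point: the crossed ridge mixes locally flat and sharply curved regions, while the diagonal sine is non-axis-aligned, so axis-aligned generalization alone cannot capture it.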
6. Evaluation with Different Sampling Types

- Normal: uniform sampling
- Random walk sampling: next sample is located in radial vicinity of the previous one
- Random walk sampling in a ring (area of distance .3 to .4 from center)
- Centered, Gaussian sampling
- Ring-based Gaussian sampling

Parameter settings: N = 4000, ε₀ = 0.002
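Three of the sampling schemes above can be sketched as generators over the unit square. This is an illustrative sketch: the step size of the random walk and the noise width of the ring-based Gaussian sampler are assumptions, not values from the slides.

```python
import math
import random

def uniform_sample():
    """Uniform sampling over the unit square [0, 1]^2."""
    return (random.random(), random.random())

def random_walk_sample(prev, step=0.05):
    """Next sample in the radial vicinity of the previous one.

    The step size is an illustrative assumption; the walk is clamped
    to stay inside the unit square.
    """
    angle = random.uniform(0.0, 2.0 * math.pi)
    r = random.uniform(0.0, step)
    x = min(1.0, max(0.0, prev[0] + r * math.cos(angle)))
    y = min(1.0, max(0.0, prev[1] + r * math.sin(angle)))
    return (x, y)

def ring_gaussian_sample(center=(0.5, 0.5), r_min=0.3, r_max=0.4, sigma=0.02):
    """Ring-based Gaussian sampling: a point at distance .3 to .4 from the
    center, perturbed radially by Gaussian noise (sigma is an assumption)."""
    angle = random.uniform(0.0, 2.0 * math.pi)
    radius = random.uniform(r_min, r_max) + random.gauss(0.0, sigma)
    return (center[0] + radius * math.cos(angle),
            center[1] + radius * math.sin(angle))
```

The non-uniform samplers concentrate experience in a sub-region of the input space, which is exactly the regime where global deletion can forget classifiers covering the unsampled remainder.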
19. Summary & Conclusions

- Local deletion does not negatively affect performance.
- During condensation, local deletion can ensure better sustenance of the problem solution.
- Some of the results also indicate better structural development during learning.
- These results have been confirmed in various other settings.
- There is no apparent drawback to applying local deletion (constant computational overhead).
- Use this mechanism also in other condition settings!
- Use it also to selectively learn more accurate and less accurate approximations in different problem subspaces!