000146260 001__ 146260
000146260 005__ 20180317093250.0
000146260 037__ $$aREP_WORK
000146260 245__ $$aThe more you learn, the less you store: memory-controlled incremental SVM
000146260 269__ $$a2006
000146260 260__ $$bIDIAP$$c2006
000146260 336__ $$aReports
000146260 520__ $$aThe capability to learn from experience is a key property for a visual recognition algorithm working in realistic settings. This paper presents an SVM-based algorithm capable of learning model representations incrementally while keeping memory requirements under control. We combine an incremental extension of SVMs with a method that reduces the number of support vectors needed to build the decision function without any loss in performance, and we introduce a parameter that permits a user-set trade-off between performance and memory. The resulting algorithm is guaranteed to achieve the same recognition results as the original incremental method while reducing memory growth. Moreover, experiments in two domains, material recognition and place recognition, show that memory requirements can be reduced consistently with only a moderate loss in performance. For example, results show that accepting a 5% reduction in recognition rate yields a memory reduction of up to 50%.
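The abstract names two ingredients, an incremental extension of SVMs and a support-vector reduction step governed by a user-set performance/memory trade-off parameter. The sketch below, in Python with scikit-learn, is a minimal illustration built on assumptions: the retrain-on-support-vectors scheme, the max_drop tolerance, and the prune-smallest-|dual coefficient|-first heuristic are stand-ins for the authors' actual procedure, which the abstract does not specify.

    # Minimal sketch, NOT the authors' algorithm: an incremental SVM that
    # retrains on remembered support vectors plus each new batch, then
    # prunes stored vectors while held-out accuracy stays within a
    # user-set tolerance (the performance/memory trade-off parameter).
    import numpy as np
    from sklearn.svm import SVC

    class MemoryControlledIncrementalSVM:
        def __init__(self, max_drop=0.05, **svc_params):
            self.max_drop = max_drop      # tolerated loss in recognition rate
            self.svc_params = svc_params  # passed through to sklearn's SVC
            self.svc = None
            self.X_mem = None             # stored training vectors ("memory")
            self.y_mem = None

        def partial_fit(self, X_new, y_new):
            # Incremental step: retrain on old support vectors + new batch.
            if self.X_mem is None:
                X, y = np.asarray(X_new), np.asarray(y_new)
            else:
                X = np.vstack([self.X_mem, X_new])
                y = np.concatenate([self.y_mem, y_new])
            self.svc = SVC(**self.svc_params).fit(X, y)
            sv = self.svc.support_        # keep only the support vectors
            self.X_mem, self.y_mem = X[sv], y[sv]
            return self

        def reduce_memory(self, X_val, y_val):
            # Reduction step: greedily drop the stored vector with the
            # smallest dual coefficient while validation accuracy stays
            # within max_drop of the unpruned model.
            base = self.svc.score(X_val, y_val)
            weight = np.abs(self.svc.dual_coef_).max(axis=0)
            keep = np.ones(len(self.y_mem), dtype=bool)
            for i in np.argsort(weight):
                keep[i] = False
                if len(np.unique(self.y_mem[keep])) < 2:
                    keep[i] = True        # need at least two classes left
                    break
                trial = SVC(**self.svc_params).fit(self.X_mem[keep],
                                                   self.y_mem[keep])
                if trial.score(X_val, y_val) < base - self.max_drop:
                    keep[i] = True        # undo the removal that hurt
                    break
                self.svc = trial
            self.X_mem, self.y_mem = self.X_mem[keep], self.y_mem[keep]
            return self

Under these assumptions, max_drop=0.05 mirrors the abstract's example of trading a 5% drop in recognition rate for a smaller stored model, and max_drop=0 only removes vectors that leave validation accuracy unchanged, loosely corresponding to the no-loss guarantee the abstract claims for the unreduced incremental method.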
000146260 700__ $$aPronobis, Andrzej
000146260 700__ $$0243991$$aCaputo, Barbara$$g190271
000146260 8564_ $$uhttp://publications.idiap.ch/downloads/reports/2006/pronobis-idiap-rr-06-51.pdf$$zURL
000146260 8564_ $$s1409199$$uhttps://infoscience.epfl.ch/record/146260/files/pronobis-idiap-rr-06-51.pdf$$zn/a
000146260 909CO $$ooai:infoscience.tind.io:146260$$preport$$pSTI
000146260 909C0 $$0252189$$pLIDIAP$$xU10381
000146260 937__ $$aLIDIAP-REPORT-2006-020
000146260 970__ $$apronobis:rr06-51/LIDIAP
000146260 973__ $$aEPFL$$sPUBLISHED
000146260 980__ $$aREPORT