Generalized Learning Riemannian Space Quantization: A Case Study on Riemannian Manifold of SPD Matrices
Updated: 2021-12-31

Learning vector quantization (LVQ) is a simple, efficient, and widely used classification method. However, in many classification scenarios, such as electroencephalogram (EEG) classification, the input features are symmetric positive-definite (SPD) matrices, which live on a curved manifold rather than in flat Euclidean space. In this article, we propose a new classification method, within the LVQ framework, for data points that live on curved Riemannian manifolds. The proposed method replaces the Euclidean distance in generalized LVQ (GLVQ) with a distance induced by an appropriate Riemannian metric. We instantiate the method for the Riemannian manifold of SPD matrices equipped with the Riemannian natural metric. Empirical investigations on synthetic data and real-world motor imagery EEG data demonstrate that the proposed generalized learning Riemannian space quantization (GLRSQ) significantly outperforms Euclidean GLVQ, generalized relevance LVQ (GRLVQ), and generalized matrix LVQ (GMLVQ). The proposed method is also competitive with state-of-the-art methods on motor imagery EEG classification.
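As a minimal illustrative sketch (not the authors' implementation), the Python snippet below assumes the "Riemannian natural metric" on SPD matrices is the standard affine-invariant metric and shows how a Riemannian distance replaces the Euclidean one in a GLVQ-style nearest-prototype comparison. The helper names airm_distance and random_spd, the toy data, and the use of NumPy/SciPy are assumptions made here for illustration.

```python
import numpy as np
from scipy.linalg import eigh

def airm_distance(A, B):
    """Affine-invariant Riemannian distance between SPD matrices A and B.

    d(A, B) = ||log(A^{-1/2} B A^{-1/2})||_F, computed via the generalized
    eigenvalues of the pencil (B, A), which equal the eigenvalues of A^{-1} B.
    """
    w = eigh(B, A, eigvals_only=True)           # generalized eigenvalues, all > 0 for SPD inputs
    return float(np.sqrt(np.sum(np.log(w) ** 2)))

# Toy usage (hypothetical data): classify an SPD sample by its nearest prototype.
rng = np.random.default_rng(0)

def random_spd(n):
    """Draw a random SPD matrix, for illustration only."""
    M = rng.standard_normal((n, n))
    return M @ M.T + n * np.eye(n)

W_pos, W_neg = random_spd(4), random_spd(4)     # two class prototypes (SPD matrices)
X = random_spd(4)                               # query sample

d_pos = airm_distance(X, W_pos)                 # distance to one prototype
d_neg = airm_distance(X, W_neg)                 # distance to the other
mu = (d_pos - d_neg) / (d_pos + d_neg)          # GLVQ-style relative-distance term
print("nearest prototype:", "W_pos" if d_pos < d_neg else "W_neg", "| mu =", mu)
```

The generalized eigenvalue route avoids forming matrix square roots explicitly: the eigenvalues of A^{-1}B coincide with those of A^{-1/2} B A^{-1/2}, so summing their squared logarithms yields the squared affine-invariant distance. The paper's full method additionally learns the prototypes themselves on the SPD manifold, which is not shown in this sketch.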


This work was published in IEEE Transactions on Neural Networks and Learning Systems, vol. 32, no. 1, pp. 281-292, 2021.
