A study on CNN image classification of EEG Signals represented in 2D and 3D

Abstract

Objective. The novelty of this study lies in its exploration of several new approaches to pre-processing brainwave signal data, wherein statistical features are extracted and then formatted as visual images based on the order in which dimensionality reduction algorithms select them. These data are then treated as visual input for 2D and 3D convolutional neural networks (CNNs), which further extract 'features of features'. Approach. Statistical features derived from three electroencephalography (EEG) datasets are presented in visual space and processed in 2D and 3D space as pixels and voxels respectively. Three datasets are benchmarked: mental attention states and emotional valences from the four 10-20 electrodes TP9, AF7, AF8 and TP10, and eye state data from 64 electrodes. Seven hundred and twenty-nine features are selected through three selection methods to form 27 × 27 images and 9 × 9 × 9 cubes from the same datasets. CNNs engineered for the 2D and 3D pre-processing representations learn to convolve useful graphical features from the data. Main results. A 70/30 split shows that the strongest feature-selection methods for classification accuracy are One Rule for attention state and Relative Entropy for emotional state, both in 2D. For the eye state dataset, 3D space is best, with features selected by Symmetrical Uncertainty. Finally, 10-fold cross validation is used to train the best topologies. The final best 10-fold results are 97.03% for attention state (2D CNN), 98.4% for emotional state (3D CNN), and 97.96% for eye state (3D CNN). Significance. The findings of the framework presented in this work show that CNNs can successfully convolve useful features from a set of pre-computed statistical temporal features derived from raw EEG waves. The high performance of the k-fold validated algorithms argues that the features learnt by the CNNs hold useful knowledge for classification in addition to the pre-computed features.
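The core pre-processing step described in the abstract is reshaping the 729 selected statistical features per EEG window into a 27 × 27 image (for the 2D CNN) or a 9 × 9 × 9 cube of voxels (for the 3D CNN), with pixel/voxel order following the feature-selection ranking. A minimal sketch of that reshaping, assuming NumPy and a hypothetical pre-computed, already-ranked feature vector:

```python
import numpy as np

# Hypothetical feature vector for one EEG window: 729 statistical
# features, already ordered by the feature-selection algorithm
# (e.g. One Rule, Relative Entropy, or Symmetrical Uncertainty).
features = np.arange(729, dtype=np.float32)

# 2D representation: a 27 x 27 single-channel "image" for a 2D CNN.
image_2d = features.reshape(27, 27)

# 3D representation: a 9 x 9 x 9 cube of "voxels" for a 3D CNN.
cube_3d = features.reshape(9, 9, 9)

print(image_2d.shape)  # (27, 27)
print(cube_3d.shape)   # (9, 9, 9)
```

Because 729 = 27² = 9³, the same feature vector maps losslessly to either representation; only the spatial neighbourhoods seen by the convolution kernels differ between the two.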

Publication DOI: https://doi.org/10.1088/1741-2552/abda0c
Divisions: College of Engineering & Physical Sciences > School of Informatics and Digital Engineering > Computer Science
College of Engineering & Physical Sciences
College of Engineering & Physical Sciences > Aston Institute of Urban Technology and the Environment (ASTUTE)
College of Engineering & Physical Sciences > Systems analytics research institute (SARI)
Additional Information: ©2021 IOP Publishing Ltd. After the Embargo Period, the full text of the Accepted Manuscript may be made available on the non-commercial repository for anyone with an internet connection to read and download. After the Embargo Period a CC BY-NC-ND 3.0 licence applies to the Accepted Manuscript, in which case it may then only be posted under that CC BY-NC-ND licence provided that all the terms of the licence are adhered to, and any copyright notice and any cover sheet applied by IOP is not deleted or modified.
Uncontrolled Keywords: EEG classification, applied intelligence, data preprocessing, human-machine interaction, Biomedical Engineering, Cellular and Molecular Neuroscience
Related URLs: https://iopscie ... 741-2552/abda0c (Publisher URL)
http://www.scop ... tnerID=8YFLogxK (Scopus URL)
PURE Output Type: Article
Published Date: 2021-04
Published Online Date: 2021-01-08
Accepted Date: 2021-01-08
Authors: Bird, Jordan J
Faria, Diego R (ORCID Profile 0000-0002-2771-1713)
Manso, Luis J (ORCID Profile 0000-0003-2616-1120)
Ayrosa, Pedro Paulo Da Silva
Ekart, Aniko (ORCID Profile 0000-0001-6967-5397)

Download

Version: Accepted Version

Access Restriction: Restricted to Repository staff only until 8 January 2022.

License: Creative Commons Attribution Non-commercial No Derivatives


