Parallelization of Recurrent Neural Network-Based Equalizer for Coherent Optical Systems via Knowledge Distillation

Abstract

Recurrent neural network (RNN)-based equalizers, especially the bidirectional long short-term memory (biLSTM) structure, have already been proven to outperform feed-forward NNs in nonlinear mitigation in coherent optical systems. However, the recurrent connections still prevent the computation from being fully parallelizable. To circumvent the non-parallelizability of recurrent equalizers, we propose, for the first time, knowledge distillation (KD) to recast the biLSTM into a parallelizable feed-forward 1D-convolutional NN (1D-CNN) structure. In this work, we apply KD to the cross-architecture regression problem, an application that is still in its infancy. We highlight how KD helps the student learn from the teacher in the regression setting. Additionally, we provide a comparative study of the performance of the NN-based equalizers for both the teacher and students with different NN architectures. The performance comparison was carried out in terms of the Q-factor, inference speed, and computational complexity, and the equalization performance was evaluated using both simulated and experimental data. The 1D-CNN outperformed the other NN types as a student model with respect to the Q-factor. The proposed 1D-CNN showed a significant reduction in inference time compared to the biLSTM while maintaining comparable performance on the experimental data and experiencing only a slight Q-factor degradation on the simulated data.
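
The sketch below illustrates the general idea described in the abstract: a recurrent biLSTM teacher distilled into a parallelizable 1D-CNN student by blending a loss on the true labels with a loss on the teacher's outputs. It is a minimal illustration assuming a PyTorch setup; the layer sizes, tap count, and the weighting factor alpha are illustrative placeholders, not values taken from the paper.

```python
# Minimal sketch of cross-architecture knowledge distillation for regression.
# Assumes PyTorch; all hyperparameters below are illustrative, not from the paper.
import torch
import torch.nn as nn

class BiLSTMTeacher(nn.Module):
    """Recurrent teacher: biLSTM over a window of received symbols."""
    def __init__(self, in_features=2, hidden=64):
        super().__init__()
        self.lstm = nn.LSTM(in_features, hidden, batch_first=True, bidirectional=True)
        self.head = nn.Linear(2 * hidden, 2)   # real/imag of the equalized symbol

    def forward(self, x):                      # x: (batch, taps, 2)
        h, _ = self.lstm(x)
        return self.head(h[:, h.size(1) // 2]) # predict the central symbol of the window

class Conv1DStudent(nn.Module):
    """Feed-forward, parallelizable student: 1D-CNN over the same window."""
    def __init__(self, in_features=2, channels=64, taps=41):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv1d(in_features, channels, kernel_size=11, padding=5),
            nn.LeakyReLU(),
            nn.Conv1d(channels, channels, kernel_size=11, padding=5),
            nn.LeakyReLU(),
            nn.Flatten(),
            nn.Linear(channels * taps, 2),
        )

    def forward(self, x):                      # x: (batch, taps, 2)
        return self.net(x.transpose(1, 2))     # Conv1d expects (batch, channels, taps)

def distillation_step(student, teacher, x, y, optimizer, alpha=0.5):
    """One KD step: blend the loss against the true labels with the loss
    against the frozen teacher's soft targets."""
    teacher.eval()
    with torch.no_grad():
        y_teacher = teacher(x)
    y_student = student(x)
    mse = nn.MSELoss()
    loss = alpha * mse(y_student, y) + (1 - alpha) * mse(y_student, y_teacher)
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
    return loss.item()

# Example usage with random tensors standing in for received/transmitted symbols.
if __name__ == "__main__":
    taps = 41
    teacher, student = BiLSTMTeacher(), Conv1DStudent(taps=taps)
    optimizer = torch.optim.Adam(student.parameters(), lr=1e-3)
    x = torch.randn(256, taps, 2)   # window of received I/Q samples
    y = torch.randn(256, 2)         # transmitted symbol at the central position
    print(distillation_step(student, teacher, x, y, optimizer))
```

Because the student contains no recurrent connections, all output symbols in a batch can be computed in parallel, which is the motivation for the distillation stated in the abstract.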

Publication DOI: https://doi.org/10.1109/jlt.2023.3337604
Divisions: College of Engineering & Physical Sciences > Aston Institute of Photonics Technology (AIPT)
College of Engineering & Physical Sciences
Funding Information: This work has received funding from the EU Horizon 2020 program under the Marie Skłodowska-Curie grant agreement No. 956713 (MENTOR). SKT acknowledges the support of the EPSRC project TRANSNET (EP/R035342/1). Bernhard Spinnler, Nelson Costa, Antonio Napoli
Additional Information: This work is licensed under a Creative Commons Attribution 4.0 License. For more information, see https://creativecommons.org/licenses/by/4.0
Uncontrolled Keywords: Artificial intelligence, machine learning, recurrent neural networks, parallelization, knowledge distillation, nonlinear equalizer, coherent detection
Publication ISSN: 0733-8724
Last Modified: 21 Jun 2024 07:30
Date Deposited: 15 Dec 2023 12:28
Full Text Link:
Related URLs: https://ieeexpl ... cument/10333336 (Publisher URL)
http://www.scop ... tnerID=8YFLogxK (Scopus URL)
PURE Output Type: Article
Published Date: 2023-11-29
Published Online Date: 2023-11-29
Accepted Date: 2023-11-01
Authors: Srivallapanondh, Sasipim
Freire, Pedro J. (ORCID Profile 0000-0003-3145-1018)
Spinnler, Bernhard
Costa, Nelson
Napoli, Antonio
Turitsyn, Sergei K. (ORCID Profile 0000-0003-0101-3834)
Prilepsky, Jaroslaw E. (ORCID Profile 0000-0002-3035-4112)

Download

Version: Accepted Version

License: Creative Commons Attribution
