Convolutional neural network (CNN)-based frame synchronization method
- Authors
- Jeong, Eui-Rim; Lee, Eui-Soo; Joung, Jingon; Oh, Hyukjun
- Issue Date
- Oct-2020
- Publisher
- MDPI AG
- Keywords
- 2D transformation; CNN; Deep learning; Frame synchronization; Synchronized communication networks
- Citation
- Applied Sciences (Switzerland), v.10, no.20, pp 1 - 11
- Pages
- 11
- Journal Title
- Applied Sciences (Switzerland)
- Volume
- 10
- Number
- 20
- Start Page
- 1
- End Page
- 11
- URI
- https://scholarworks.bwise.kr/cau/handle/2019.sw.cau/52073
- DOI
- 10.3390/app10207267
- ISSN
- 2076-3417
- Abstract
- A new frame synchronization technique based on a convolutional neural network (CNN) is proposed for synchronized networks. To estimate the exact packet arrival time, the receiver typically correlates the received signal with the preamble or pilot placed at the front of the transmitted packet. The conventional frame synchronization technique searches for the correlation peak within a time window. In contrast, the proposed method uses a CNN to find the packet arrival time. Specifically, the 1D correlator output is reshaped into a 2D matrix, and the resulting signal is input to the proposed four-layer CNN classifier, which then predicts the packet arrival time. To verify the frame synchronization performance, computer simulations are performed for two channel models: additive white Gaussian noise and fading channels. Simulation results show that the proposed CNN-based synchronization method outperforms the conventional correlation-based technique by 2 dB. © 2020 by the authors. Licensee MDPI, Basel, Switzerland.
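The front end described in the abstract (correlate against a known preamble, then reshape the 1D correlator output into a 2D matrix for a CNN classifier) can be sketched in NumPy. This is an illustrative sketch, not the paper's implementation: the preamble, noise level, delay, and matrix width below are arbitrary assumptions, and the four-layer CNN itself is omitted.

```python
import numpy as np

def correlator_output(rx, preamble):
    """Magnitude of the sliding correlation between the received
    samples and the known preamble. This is the conventional front
    end; the paper's method replaces only the peak search after it."""
    n = len(rx) - len(preamble) + 1
    return np.array([abs(np.vdot(preamble, rx[i:i + len(preamble)]))
                     for i in range(n)])

def to_2d(corr, width):
    """Reshape the 1D correlator output into a 2D matrix (the '2D
    transformation' of the abstract) to feed a CNN classifier.
    Trailing samples that do not fill a full row are dropped here."""
    rows = len(corr) // width
    return corr[:rows * width].reshape(rows, width)

# Hypothetical example: a 16-sample BPSK preamble buried in noise.
rng = np.random.default_rng(0)
preamble = rng.choice([-1.0, 1.0], size=16)
delay = 37  # true packet arrival time (assumed for illustration)
rx = np.concatenate([rng.normal(0, 0.1, delay),
                     preamble,
                     rng.normal(0, 0.1, 100)])

corr = correlator_output(rx, preamble)
peak = int(np.argmax(corr))   # conventional estimate: the peak location
grid = to_2d(corr, 11)        # 2D input for the (omitted) 4-layer CNN
print(peak, grid.shape)
```

At this noise level the conventional peak search already recovers the delay; the paper's point is that a CNN classifier operating on `grid` estimates the arrival time more reliably at low SNR, where the peak search starts to fail.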
- Files in This Item
- There are no files associated with this item.
- Appears in Collections
- College of ICT Engineering > School of Electrical and Electronics Engineering > 1. Journal Articles