Detailed Information

Log-quantization on GRU networks

Authors
Park, Sang-Ki; Park, Sang-Soo; Chung, Ki Seok
Issue Date
Nov-2018
Publisher
Association for Computing Machinery
Keywords
AI; CNN; FPGA; HLS; HW/SW Co-Design; LeNet-5; SDSoC
Citation
ACM International Conference Proceeding Series, pp. 112-116
Indexed
SCOPUS
Journal Title
ACM International Conference Proceeding Series
Start Page
112
End Page
116
URI
https://scholarworks.bwise.kr/hanyang/handle/2021.sw.hanyang/5240
DOI
10.1145/3290420.3290443
Abstract
Today, recurrent neural networks (RNNs) are used in a variety of applications such as image captioning, speech recognition, and machine translation. However, because of their data dependencies, RNNs are hard to parallelize. Furthermore, to improve accuracy, RNNs employ complicated cell units such as the long short-term memory (LSTM) and the gated recurrent unit (GRU). To run such models on an embedded system, the size of the network model and the amount of computation must be reduced to achieve low power consumption and a low memory bandwidth requirement. In this paper, an implementation of a GRU-based RNN with a logarithmic quantization method is proposed. The proposed implementation is synthesized using high-level synthesis (HLS) targeting a Xilinx ZCU102 FPGA running at 100 MHz. With 8-bit log-quantization, the proposed implementation achieves 90.57% accuracy without re-training or fine-tuning, and its memory usage is 31% lower than that of an implementation using a 32-bit floating-point data representation.
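Logarithmic quantization, the core operation described in the abstract, rounds each weight to the nearest signed power of two so that multiplications in the GRU's matrix-vector products can be replaced by bit shifts in hardware. The following NumPy sketch illustrates one plausible 8-bit log-quantizer; the function name, the bit allocation (one sign bit plus a clipped integer exponent), and the exponent range are illustrative assumptions, not the paper's exact scheme.

import numpy as np

def log_quantize(w, num_bits=8, max_exp=0):
    # Hypothetical scheme: round each nonzero weight to the nearest
    # signed power of two, with the exponent clipped to the range
    # [max_exp - (2**(num_bits - 1) - 1), max_exp].
    min_exp = max_exp - (2 ** (num_bits - 1) - 1)
    sign = np.sign(w)
    mag = np.abs(w)
    q = np.zeros_like(w, dtype=np.float64)
    nz = mag > 0
    exp = np.clip(np.round(np.log2(mag[nz])), min_exp, max_exp)
    q[nz] = sign[nz] * np.exp2(exp)
    return q

# Quantize a small random weight matrix and inspect the rounding error.
rng = np.random.default_rng(0)
W = rng.normal(scale=0.1, size=(4, 4))
print(np.max(np.abs(W - log_quantize(W))))

Because every quantized value has the form ±2^e, a hardware multiplier can be replaced by a shifter, which is what makes such a design attractive for a low-power FPGA implementation and is consistent with the memory savings reported in the abstract.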
Appears in Collections
College of Engineering, Seoul > School of Electronic Engineering, Seoul > 1. Journal Articles


Items in ScholarWorks are protected by copyright, with all rights reserved, unless otherwise indicated.

Related Researcher

Chung, Ki Seok
COLLEGE OF ENGINEERING (SCHOOL OF ELECTRONIC ENGINEERING)
