
A Cooperative Lightweight Translation Algorithm Combined with Sparse-ReLU

2022-07-19

 

Author(s): Xu, XT (Xu, Xintao); Liu, Y (Liu, Yi); Chen, G (Chen, Gang); Ye, JB (Ye, Junbin); Li, ZG (Li, Zhigang); Lu, HX (Lu, Huaxiang)

Source: COMPUTATIONAL INTELLIGENCE AND NEUROSCIENCE Volume: 2022 Article Number: 4398839 DOI: 10.1155/2022/4398839 Published: MAY 28 2022

Abstract: In the field of natural language processing (NLP), Transformer-based machine translation algorithms are challenging to deploy on hardware because of their large number of parameters and the low parametric sparsity of the network weights. Meanwhile, the accuracy of lightweight machine translation networks also needs to be improved. To address this problem, we first design a new activation function, Sparse-ReLU, which improves the parametric sparsity of the weights and feature maps and thus facilitates hardware deployment. Second, we design a novel cooperative processing scheme that combines a CNN with the Transformer and uses Sparse-ReLU to improve the accuracy of the translation algorithm. Experimental results show that our method, which combines the Transformer and a CNN with Sparse-ReLU, achieves a 2.32% BLEU improvement in prediction accuracy, reduces the number of model parameters by 23%, and increases the sparsity of the inference model by more than 50%.
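The abstract does not spell out the exact form of Sparse-ReLU, so the following is only a minimal sketch of one plausible interpretation: a thresholded ReLU that zeroes out activations below a cutoff tau, which would increase feature-map sparsity as the abstract describes. The class name SparseReLU, the parameter tau, and its default value are illustrative assumptions, not the authors' definition.

```python
# Hypothetical sketch of a "Sparse-ReLU"-style activation (NOT the paper's exact definition).
# Assumption: positive activations below a threshold tau are clamped to zero, so that the
# feature maps (and, per the abstract's goal, the trained weights) become sparser.
import torch
import torch.nn as nn


class SparseReLU(nn.Module):
    def __init__(self, tau: float = 0.1):
        super().__init__()
        self.tau = tau  # sparsity threshold (assumed hyperparameter)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # Standard ReLU, then zero out small positive activations below tau.
        y = torch.relu(x)
        return torch.where(y >= self.tau, y, torch.zeros_like(y))


if __name__ == "__main__":
    act = SparseReLU(tau=0.1)
    x = torch.randn(2, 8)
    y = act(x)
    # Fraction of exactly-zero entries, i.e. the activation sparsity this sketch targets.
    print("sparsity:", (y == 0).float().mean().item())
```

In such a scheme, tau trades accuracy for sparsity: a larger threshold zeroes more activations (cheaper hardware inference) at the risk of discarding useful signal, which is consistent with the paper's stated goal of raising inference-model sparsity while preserving BLEU.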

Accession Number: WOS:000819224500011

PubMed ID: 35669640

Author Identifiers:

Author          Web of Science ResearcherID     ORCID Number
Liu, Yi                                         0000-0003-3056-7713
Xu, Xintao                                      0000-0002-3389-7518

ISSN: 1687-5265

eISSN: 1687-5273

Full Text: https://www.hindawi.com/journals/cin/2022/4398839/


