
Novel activation function with pixelwise modeling capacity for lightweight neural network design

2021-07-22

 

Author(s): Liu, Y (Liu, Yi); Guo, XZ (Guo, Xiaozhou); Tan, KJ (Tan, Kaijun); Gong, GL (Gong, Guoliang); Lu, HX (Lu, Huaxiang)

Source: CONCURRENCY AND COMPUTATION-PRACTICE & EXPERIENCE Article Number: e6350 DOI: 10.1002/cpe.6350 Early Access Date: JUL 2021

Abstract: The development of lightweight networks has made neural networks efficient enough to be widely applied to various tasks. Considering deployment on hardware such as edge devices and mobile phones, we prioritize lightweight networks. However, their accuracy has always lagged far behind that of SOTA networks. In this article, we present a simple yet effective activation function, called WReLU, that significantly improves the performance of lightweight networks by adding a residual spatial condition. Moreover, we use a strategy that switches activation functions after determining which convolutional layers to apply them to. We perform experiments on the ImageNet 2012 classification dataset on CPU, GPU, and edge devices. The experiments demonstrate that WReLU significantly improves classification accuracy, while our strategy balances the cost of the additional parameters and multiply-accumulate operations. Our method improves the accuracy of SqueezeNet and SqueezeNext by more than 5% without adding extensive parameters or computation. For lightweight networks with a large number of parameters, such as MobileNet and ShuffleNet, there is also a significant improvement. Additionally, the inference speed of most lightweight networks using our WReLU strategy is almost the same as that of the baseline models on different platforms. Our approach not only preserves the practicability of lightweight networks but also improves their performance.
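
The abstract does not give WReLU's exact formulation, so the following is only a minimal sketch of the idea it describes: a ReLU-style activation augmented with a learned, pixelwise "residual spatial condition". The class name WReLUSketch, the depthwise 3x3 convolution, and the additive combination are illustrative assumptions, not the paper's definition.

import torch
import torch.nn as nn

class WReLUSketch(nn.Module):
    """Hypothetical WReLU-like activation: ReLU plus a learned spatial residual."""

    def __init__(self, channels: int, kernel_size: int = 3):
        super().__init__()
        # A depthwise convolution keeps the extra parameters and
        # multiply-accumulates small, matching the abstract's emphasis
        # on lightweight deployment (assumed design choice).
        self.spatial = nn.Conv2d(channels, channels, kernel_size,
                                 padding=kernel_size // 2,
                                 groups=channels, bias=False)
        self.bn = nn.BatchNorm2d(channels)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # Pixelwise condition: each spatial location receives its own
        # learned offset, added as a residual to the ReLU response.
        condition = self.bn(self.spatial(x))
        return torch.relu(x) + condition

# Example: drop-in replacement for ReLU on a 16-channel feature map.
layer = WReLUSketch(channels=16)
y = layer(torch.randn(1, 16, 32, 32))
print(y.shape)  # torch.Size([1, 16, 32, 32])

Swapping such a module in for selected ReLU layers, in the spirit of the abstract's layer-selection strategy, adds only one per-channel 3x3 kernel of parameters per replaced activation.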

Accession Number: WOS:000670174800001

ISSN: 1532-0626

eISSN: 1532-0634

Full Text: https://onlinelibrary.wiley.com/doi/10.1002/cpe.6350


