
BinaryRelax: A relaxation approach for training deep neural networks with quantized weights

  • Penghang Yin
  • Shuai Zhang
  • Jiancheng Lyu
  • Stanley Osher
  • Yingyong Qi
  • Jack Xin

Research output: Contribution to journal › Article › peer-review

57 Scopus citations

Abstract

We propose BinaryRelax, a simple two-phase algorithm, for training deep neural networks with quantized weights. The set constraint that characterizes the quantization of weights is not imposed until the late stage of training, and a sequence of pseudo quantized weights is maintained. Specifically, we relax the hard constraint into a continuous regularizer via a Moreau envelope, which turns out to be the squared Euclidean distance to the set of quantized weights. The pseudo quantized weights are obtained by linearly interpolating between the float weights and their quantizations. A continuation strategy is adopted to push the weights toward the quantized state by gradually increasing the regularization parameter. In the second phase, an exact quantization scheme with a small learning rate is invoked to guarantee fully quantized weights. We test BinaryRelax on the benchmark CIFAR and ImageNet color image datasets to demonstrate the superiority of the relaxed quantization approach and the improved accuracy over the state-of-the-art training methods. Finally, we prove the convergence of BinaryRelax under an approximate orthogonality condition.
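The relaxed update described above can be illustrated with a minimal numerical sketch. This is not the authors' implementation; it assumes a common binary quantizer (sign of the weights scaled by their mean absolute value) and shows how the Moreau-envelope proximal step reduces to a linear interpolation between the float weights and their quantization, with a continuation schedule that grows the regularization parameter.

```python
import numpy as np

def binary_quantize(w):
    # Project onto scaled binary weights alpha * sign(w).
    # Choosing alpha as the mean absolute value is an assumption
    # made here for illustration.
    alpha = np.mean(np.abs(w))
    return alpha * np.sign(w)

def pseudo_quantize(w, lam):
    # Proximal step for the regularizer (lam/2) * dist(w, Q)^2,
    # where Q is the set of quantized weights: the result linearly
    # interpolates between the float weights and their quantization.
    return (w + lam * binary_quantize(w)) / (1.0 + lam)

# Continuation strategy: gradually increase lam so the pseudo
# quantized weights drift toward the quantized state.
rng = np.random.default_rng(0)
w = rng.standard_normal(8)
lam, rho = 1.0, 1.02
for _ in range(5):
    u = pseudo_quantize(w, lam)   # pseudo quantized weights
    lam *= rho                    # hypothetical growth factor
```

As `lam` grows, `pseudo_quantize` approaches the exact quantization, matching the paper's second phase in which an exact quantization scheme takes over.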

Original language: English
Pages (from-to): 2205-2223
Number of pages: 19
Journal: SIAM Journal on Imaging Sciences
Volume: 11
Issue number: 4
State: Published - 2018

Keywords

  • BinaryRelax
  • Continuous relaxation
  • Deep neural networks
  • Quantization

