
Stochastic Backward Euler: An Implicit Gradient Descent Algorithm for k-Means Clustering

Research output: Contribution to journal › Article › peer-review

15 Scopus citations

Abstract

In this paper, we propose an implicit gradient descent algorithm for the classic k-means problem. The implicit gradient step, or backward Euler, is solved via stochastic fixed-point iteration, in which we randomly sample a mini-batch gradient in every iteration. It is the average of the fixed-point trajectory that is carried over to the next gradient step. We draw connections between the proposed stochastic backward Euler and the recent entropy stochastic gradient descent for improving the training of deep neural networks. Numerical experiments on various synthetic and real datasets show that the proposed algorithm provides better clustering results than standard k-means algorithms, in the sense that it decreases the objective function (the cluster energy) and is much more robust to initialization.
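The procedure described in the abstract can be sketched as follows. This is a minimal illustration, not the authors' implementation: each outer step approximates the implicit (backward Euler) update by running a stochastic fixed-point iteration with freshly sampled mini-batch gradients, and the average of the fixed-point trajectory becomes the next centroid iterate. All names, the step size `eta`, and the iteration counts are illustrative assumptions.

```python
import numpy as np

def minibatch_grad(centroids, batch):
    """Mini-batch gradient of the k-means objective w.r.t. the centroids.

    Each sampled point contributes 2*(c_j - x) to its nearest centroid c_j,
    normalized by the batch size.
    """
    d = ((batch[:, None, :] - centroids[None, :, :]) ** 2).sum(-1)
    labels = d.argmin(1)  # nearest-centroid assignment
    grad = np.zeros_like(centroids)
    for j in range(centroids.shape[0]):
        pts = batch[labels == j]
        if len(pts):
            grad[j] = 2.0 * (centroids[j] - pts.mean(0)) * len(pts) / len(batch)
    return grad

def stochastic_backward_euler(X, k, eta=0.5, outer=50, inner=20,
                              batch=64, init=None, seed=0):
    """Sketch of stochastic backward Euler for k-means (illustrative only)."""
    rng = np.random.default_rng(seed)
    # random initialization from the data unless an explicit init is given
    c = X[rng.choice(len(X), k, replace=False)].copy() if init is None \
        else init.astype(float).copy()
    for _ in range(outer):
        z, avg = c.copy(), np.zeros_like(c)
        for _ in range(inner):
            sample = X[rng.choice(len(X), batch)]
            # stochastic fixed-point update for the implicit step
            # z = c - eta * grad(z), with a mini-batch gradient estimate
            z = c - eta * minibatch_grad(z, sample)
            avg += z
        # the trajectory average is carried over to the next gradient step
        c = avg / inner
    return c
```

On well-separated synthetic clusters, the returned centroids settle near the cluster means; the averaging step damps the mini-batch noise of the fixed-point trajectory.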

Original language: English
Pages (from-to): 1133-1146
Number of pages: 14
Journal: Journal of Scientific Computing
Volume: 77
Issue number: 2
DOIs
State: Published - Nov 1 2018

Keywords

  • Backward Euler
  • Fixed-point iteration
  • Implicit gradient descent
  • Mini-batch gradient
  • k-Means
