How to use a Hopfield recurrent neural network trained on natural images to perform state-of-the-art image compression

  • Christopher Hillar (UC Berkeley, Redwood Center for Theoretical Neuroscience, USA)
A3 01 (Sophus-Lie room)


The Hopfield network is a well-known model of memory and collective processing in networks of abstract neurons, but it has been dismissed for use in signal processing because of its small pattern capacity, the difficulty of training it, and a lack of practical applications. In the last few years, however, it has been demonstrated that exponential storage is possible for special classes of patterns and network connectivity structures. Over the same period, advances in training large-scale networks have also appeared. Here, we train Hopfield networks on discretizations of grayscale digital photographs using a learning technique called minimum probability flow (MPF).
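As a rough illustration of the training idea (not the authors' implementation), MPF fits the weights of a Hopfield network by minimizing, over the training patterns, the sum of exp(-ΔE_j/2) across all single-bit flips, where ΔE_j is the energy increase from flipping bit j. Driving this objective down makes each training pattern a local energy minimum, i.e. a fixed point of the dynamics. The sketch below trains a small network on random ±1 patterns with plain gradient descent; the dimensions, learning rate, and iteration count are illustrative choices, not values from the talk.

```python
import numpy as np

rng = np.random.default_rng(0)

n, p = 16, 3                      # neurons, training patterns (toy sizes)
X = rng.choice([-1.0, 1.0], size=(p, n))  # patterns as rows of +/-1

W = np.zeros((n, n))              # symmetric weights, zero diagonal
b = np.zeros(n)                   # biases
lr = 0.05

for _ in range(1000):
    H = X @ W + b                 # local fields, shape (p, n)
    G = np.exp(-X * H)            # exp(-dE_j / 2) for each single-bit flip
    # gradient of the MPF objective w.r.t. W, then symmetrized
    grad_W = -(G * X).T @ X / p
    grad_W = (grad_W + grad_W.T) / 2
    np.fill_diagonal(grad_W, 0.0)
    grad_b = -(G * X).mean(axis=0)
    W -= lr * grad_W
    b -= lr * grad_b

# after training, each pattern should be a fixed point of sign(Wx + b)
stable = bool(np.all(np.sign(X @ W + b) == X))
print(stable)
```

Note that MPF never computes the partition function of the network's Gibbs distribution; it only compares each data pattern with its one-bit-flip neighbors, which is what makes it practical for training at scale.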

After training, we demonstrate that these networks have exponential memory capacity, allowing them to perform state-of-the-art image compression in the high-quality regime. Our findings suggest that the local structure of images is remarkably well modeled by a binary recurrent neural network.
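The memory mechanism underlying this is the standard Hopfield attractor dynamics: a corrupted binary pattern is driven back to the stored pattern by repeated sign updates. The toy sketch below uses a single pattern stored with a Hebbian outer-product rule (the classical construction, not the MPF-trained networks of the talk) to show pattern completion from a noisy input.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 64
pattern = rng.choice([-1.0, 1.0], size=n)

# Hebbian outer-product weights storing one pattern, zero self-connections
W = np.outer(pattern, pattern)
np.fill_diagonal(W, 0.0)

# corrupt 10 of the 64 bits
x = pattern.copy()
flip = rng.choice(n, size=10, replace=False)
x[flip] *= -1

# asynchronous updates until a fixed point is reached
changed = True
while changed:
    changed = False
    for j in range(n):
        s = 1.0 if W[j] @ x >= 0 else -1.0
        if s != x[j]:
            x[j] = s
            changed = True

recovered = bool(np.array_equal(x, pattern))
print(recovered)
```

Because each asynchronous update can only lower the network energy, the dynamics must terminate at a local minimum; when the stored patterns are the dominant minima, that terminal state is the denoised pattern.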

(Joint work with Ram Mehta and Kilian Koepsell).