This page collects some things I have come across and find interesting.
The Differentiable Neural Computer: while RNNs can already perform computation to a minor extent, this work (and earlier related models such as the Neural GPU) opens up a whole new realm of neural computational models that begin to move towards the computational power of regular computer programs. Very exciting to see how this area will develop further.
Fast Weights by Jimmy Ba, Geoffrey Hinton, Volodymyr Mnih, Joel Z. Leibo, and Catalin Ionescu: this greatly increases the capacity of RNNs for holding internal state, as it stores temporary state in the connection weights rather than in the units.
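As a rough illustration (my own sketch, not the authors' implementation), the core fast-weights idea can be written in a few lines of numpy: a fast-weight matrix A is updated with a decaying outer product of the hidden state, and an inner loop lets A refine the next hidden state. All names and the hyperparameters `lam` and `eta` are placeholders:

```python
import numpy as np

def step(h, x, W, C, A, lam=0.95, eta=0.5, inner_steps=1):
    """One fast-weights RNN time step (illustrative sketch).

    h: previous hidden state, x: current input,
    W, C: slow (learned) weights, A: fast-weight matrix.
    """
    # Slow-weight contribution, computed once per time step
    preact = W @ h + C @ x
    # Hebbian-style fast-weight update with decay:
    # A(t) = lam * A(t-1) + eta * h h^T
    A = lam * A + eta * np.outer(h, h)
    # Inner loop: fast weights repeatedly refine the new hidden state
    hs = np.tanh(preact)
    for _ in range(inner_steps):
        hs = np.tanh(preact + A @ hs)
    return hs, A
```

The point of the sketch is that the short-term memory lives in A (a matrix updated at every step), not in extra hidden units.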
Wavenet: a generative music model that works at the raw audio waveform level. Especially the piano music output (1 2 3 4 5 6 ), where the full audio signal is generated by the network, capturing structure from detailed sound waves up to long-range musical form, is fascinating!
Chris Olah’s blog on Neural Networks
Andrej Karpathy’s page on the unreasonable effectiveness of recurrent neural networks; see also his main website
Hypernetworks by Hardmaru: a small embedded LSTM that generates the weights of a larger LSTM
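The hypernetwork idea in its simplest form (a hypothetical linear sketch of mine, not Hardmaru's actual LSTM architecture): a small network maps an embedding `z` to the weight matrix of a main layer, so the main layer's weights are generated rather than learned directly. All shapes and names here are illustrative:

```python
import numpy as np

rng = np.random.default_rng(0)
z_dim, in_dim, out_dim = 4, 8, 8

# Hypernetwork: a linear map from a small embedding z
# to the flattened weight matrix of the main layer
H = rng.normal(0.0, 0.1, size=(in_dim * out_dim, z_dim))

z = rng.normal(size=z_dim)                 # layer embedding
W_main = (H @ z).reshape(out_dim, in_dim)  # generated main-layer weights

x = rng.normal(size=in_dim)
y = np.tanh(W_main @ x)                    # main layer using generated weights
```

In the paper the generator and the main network are recurrent and trained jointly; the sketch only shows the weight-generation step.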
Microsoft’s concept graph
A talk by Yoshua Bengio on work (see also this article) aimed at bridging the gap between neural networks research and theoretical neuroscience by developing neural network algorithms (alternatives to backpropagation) that resemble spike-timing-dependent plasticity
Neural machine translation