Jason to present DAB

A couple of months ago, Jason Ramapuram interned at Apple Machine Learning Research. Among other things, he worked with Russ Webb on a novel method that allows simple non-differentiable functions to be used at intermediate layers of deep neural networks. The outcome of these efforts, Differentiable Approximation Bridges (DABs), are new neural network components that approximate the non-differentiable forward functions and provide gradient updates for backpropagation, improving over existing gradient estimators by up to 77%. To learn more about DAB, you can read Jason's paper or listen to him explain the method during the IJCNN regular session on Monday, July 20 at 5:45 PM BST.
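To give a rough sense of the idea, here is a minimal sketch of the forward/backward split that a bridge enables. This is not Apple's implementation: DAB trains a small neural network to approximate the non-differentiable function, whereas this toy example uses a fixed identity surrogate purely for illustration.

```python
def hard_round(x):
    # Non-differentiable forward function used in the forward pass.
    return float(round(x))

def surrogate(x):
    # Differentiable stand-in for hard_round. In DAB this is a learned
    # network trained to match the forward function; here it is simply
    # the identity, chosen for illustration only.
    return x

def bridge_grad(x, eps=1e-6):
    # Gradient estimate taken from the surrogate (central difference),
    # since the hard function's gradient is zero almost everywhere.
    return (surrogate(x + eps) - surrogate(x - eps)) / (2 * eps)

# Forward pass keeps the exact non-differentiable output...
y = hard_round(1.3)   # -> 1.0
# ...while backpropagation receives the surrogate's gradient.
g = bridge_grad(1.3)  # -> ~1.0 instead of 0
```

The key property is that the network computes with the true non-differentiable output at inference time, while training signals flow through the differentiable approximation.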