A couple of months ago Jason Ramapuram interned at Apple Machine Learning Research. Among other things, he worked with Russ Webb on a novel method that allows simple non-differentiable functions to be used at intermediate layers of deep neural networks. The outcome of these efforts, Differentiable Approximation Bridges (DAB), are new neural network components that approximate a non-differentiable forward function and provide gradient updates for backpropagation, improving over existing gradient estimators by up to 77%. To learn more about DAB, you can read Jason's paper or listen to him explain the method during the IJCNN regular session on Monday, July 20 at 5:45 PM BST.
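To give a feel for the general idea, here is a minimal sketch (not the paper's implementation): the forward pass applies a hard, non-differentiable function, while the backward pass routes gradients through a smooth surrogate. DAB itself trains a learned approximator network alongside the model; this sketch substitutes a fixed `tanh` surrogate purely for illustration.

```python
import numpy as np

def hard_sign(x):
    # Non-differentiable forward function (gradient is zero almost everywhere)
    return np.sign(x)

def surrogate_grad(x):
    # Gradient of the smooth stand-in tanh(x), used in place of sign's gradient
    return 1.0 - np.tanh(x) ** 2

def forward_backward(x, upstream_grad):
    """Forward with the hard function; backward bridged through the surrogate."""
    y = hard_sign(x)                      # exact, non-differentiable output
    dx = upstream_grad * surrogate_grad(x)  # usable gradient for backprop
    return y, dx

x = np.array([-2.0, 0.5, 3.0])
y, dx = forward_backward(x, np.ones_like(x))
# y is the hard sign of x; dx is nonzero, so earlier layers still receive updates
```

The design point the sketch illustrates: the model keeps the exact non-differentiable behavior at inference time, while training sees an informative gradient instead of the zero (or undefined) gradient of the hard function.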