SANGMIN LIM

[PAPER REVIEW] Neural ODE - 1

Updated: Jan 20, 2022

Neural Ordinary Differential Equations - Chen et al.


This paper proposes a smart application of a black-box ODE solver, combined with the adjoint sensitivity method, to replace the discrete composition of hidden states in methods such as residual networks (ResNets), recurrent neural network decoders (RNNs), and normalizing flows with a continuous relationship between hidden states, without defining a discrete sequence of them.


At its core, I think their major contribution to the existing neural network community is a differentiable method for a black-box ODE solver that is compatible with an existing machine learning API, PyTorch.


It may sound complicated so far but we will break it down into examples and cases.


The methods listed above often use an update of the form

h_{t+1} = h_t + f(h_t, θ_t)

to build complicated transformations by composing a sequence of simpler transformations of a hidden state. The authors of the paper smartly connected this structure to Euler discretization: the update above is exactly one explicit Euler step of the ODE dh/dt = f(h, θ). Their thought process was probably, "Why not use the black-box ODE solvers when they are available?"
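To make this connection concrete, here is a minimal numerical sketch; the dynamics function `f` and the step counts are my own illustrative choices, not from the paper:

```python
import numpy as np

def f(h, t):
    # Hypothetical dynamics; in a ResNet this would be a learned residual block.
    return -0.5 * h

def resnet_forward(h0, num_blocks):
    # Residual-network-style composition: h_{t+1} = h_t + f(h_t, t).
    h = h0
    for t in range(num_blocks):
        h = h + f(h, t)
    return h

def euler_forward(h0, t0, t1, num_steps):
    # Explicit Euler integration of dh/dt = f(h, t).
    h, t = h0, t0
    dt = (t1 - t0) / num_steps
    for _ in range(num_steps):
        h = h + dt * f(h, t)
        t += dt
    return h

h0 = np.array([1.0, 2.0])
# With step size 1, the residual composition and the Euler solve coincide.
print(resnet_forward(h0, 4))
print(euler_forward(h0, 0.0, 4.0, 4))
```

Shrinking the step size (adding more, smaller steps) is what turns the discrete stack of layers into a continuous-time trajectory, which is where an adaptive ODE solver can take over.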


To implement this idea in a neural network, they needed the ODE solver to be differentiable (for backpropagation). This is where they connected the adjoint sensitivity method: it obtains gradients by solving a second ODE backward in time, rather than backpropagating through the solver's internal operations.
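The adjoint idea can be sketched on a toy linear ODE dz/dt = A z with loss L = ½‖z(T)‖². The matrix A, the loss, and the fixed-step Euler solver below are my own illustrative choices, not the paper's implementation; the key point is that the adjoint a(t) obeys da/dt = -Aᵀ a backward in time, and a(0) recovers dL/dz₀:

```python
import numpy as np

# Toy linear dynamics dz/dt = A z (A is an illustrative choice).
A = np.array([[0.0, 1.0], [-1.0, -0.1]])

def odeint_euler(deriv, y0, t0, t1, steps=1000):
    # Simple fixed-step Euler solver standing in for a black-box ODE solver.
    y, t = y0.copy(), t0
    dt = (t1 - t0) / steps
    for _ in range(steps):
        y = y + dt * deriv(y, t)
        t += dt
    return y

z0 = np.array([1.0, 0.5])
T = 1.0

# Forward pass: solve the state ODE from 0 to T.
zT = odeint_euler(lambda z, t: A @ z, z0, 0.0, T)

# Loss L = 0.5 * ||z(T)||^2, so dL/dz(T) = z(T).
# Adjoint pass: integrate da/dt = -A^T a backward from a(T) = dL/dz(T).
# Substituting s = T - t turns this into da/ds = +A^T a, solved forward in s.
aT = zT.copy()
a0 = odeint_euler(lambda a, s: A.T @ a, aT, 0.0, T)

# Sanity check against central finite differences on the same solver.
def loss(z_init):
    z = odeint_euler(lambda z, t: A @ z, z_init, 0.0, T)
    return 0.5 * float(z @ z)

eps = 1e-5
fd = np.array([(loss(z0 + eps * e) - loss(z0 - eps * e)) / (2 * eps)
               for e in np.eye(2)])
print(a0, fd)  # the two gradients agree up to discretization error
```

The cost of the adjoint pass is one extra ODE solve, with memory that does not grow with the number of solver steps, which is exactly why the connection makes black-box solvers practical inside a training loop.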


The results are impressive. However, the paper also has limitations that stem from this same smart connection.
