Automatic Differentiation in Julia

Automatic differentiation (AD), also known as algorithmic differentiation, is a set of techniques for computing exact numerical derivatives of user-provided code, ideally without requiring invasive modifications. Julia, a recently developed high-level language for scientific computing, offers several technical advances, namely just-in-time compilation, metaprogramming, and multiple dispatch, that can change the way AD is implemented in practice, making it both more user-friendly and more efficient. We present an implementation of forward- and reverse-mode AD for computing exact sparse second-order derivatives that is now integrated into JuMP, an open-source modeling language embedded in Julia that can express and solve nonlinear programming problems through connections to state-of-the-art solvers such as Ipopt.
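As a brief, hedged illustration of the approach the abstract describes, the sketch below shows how Julia's multiple dispatch enables forward-mode AD through operator overloading on a dual-number type. The Dual type and the function f are illustrative names invented here; this is a minimal sketch, not the implementation integrated into JuMP, which additionally handles reverse mode and sparse second-order derivatives.

    # Minimal forward-mode AD via dual numbers (illustrative sketch only).
    struct Dual
        val::Float64   # function value
        der::Float64   # derivative with respect to the input
    end

    # Multiple dispatch lets us overload arithmetic for Dual
    # without modifying the user's code.
    import Base: +, *, sin
    +(a::Dual, b::Dual) = Dual(a.val + b.val, a.der + b.der)
    *(a::Dual, b::Dual) = Dual(a.val * b.val, a.der * b.val + a.val * b.der)
    sin(a::Dual) = Dual(sin(a.val), cos(a.val) * a.der)

    # Differentiate f(x) = x*sin(x) + x at x = 2 by seeding der = 1.
    f(x) = x * sin(x) + x
    d = f(Dual(2.0, 1.0))
    println("f(2.0)  = ", d.val)   # exact value
    println("f'(2.0) = ", d.der)   # sin(2) + 2cos(2) + 1, exact

Because Julia's just-in-time compiler specializes f for Dual arguments, the overloaded code runs at speed comparable to the plain floating-point version, which is the efficiency argument the abstract alludes to.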
Presenter: Miles Lubin
University: Massachusetts Institute of Technology
Program: CSGF
Year: 2014