[Submitted on 10 Nov 2021 (v1), last revised 21 Jan 2022 (this version, v2)]


Abstract: Differentiable programming techniques are widely used in the community and
are responsible for the machine learning renaissance of the past several
decades. While these methods are powerful, they have limits. In this short
report, we discuss a common chaos-based failure mode which appears in a variety
of differentiable circumstances, ranging from recurrent neural networks and
numerical physics simulation to training learned optimizers. We trace this
failure to the spectrum of the Jacobian of the system under study, and provide
criteria for when a practitioner might expect this failure to spoil their
differentiation-based optimization algorithms.
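The mechanism the abstract alludes to can be seen in miniature with an iterated map: backpropagation through an unrolled system multiplies the per-step Jacobians, so when the system is chaotic (Jacobian spectrum with magnitude above one on average) gradients explode exponentially with the unroll length. The sketch below is illustrative only, not code from the report; the logistic map, parameter values, and function name are assumptions chosen to demonstrate the effect.

```python
def unrolled_grad(r, x0=0.25, steps=50):
    """Gradient of the final state of the logistic map
    x_{t+1} = r * x_t * (1 - x_t) with respect to x0, obtained by
    chaining the scalar per-step Jacobians dx_{t+1}/dx_t = r * (1 - 2 x_t),
    exactly what backpropagation through the unrolled loop computes.
    """
    x, g = x0, 1.0
    for _ in range(steps):
        g *= r * (1.0 - 2.0 * x)  # accumulate the Jacobian product
        x = r * x * (1.0 - x)     # advance the system one step
    return x, g

# Stable regime (r = 2.5): the trajectory settles at a fixed point whose
# local Jacobian has magnitude < 1, so the gradient shrinks toward zero.
_, g_stable = unrolled_grad(2.5)

# Chaotic regime (r = 3.9): per-step Jacobians exceed 1 on average,
# so the gradient magnitude grows exponentially with the horizon.
_, g_chaotic = unrolled_grad(3.9)
print(abs(g_stable), abs(g_chaotic))
```

The same product-of-Jacobians structure appears whether the unrolled system is an RNN, a physics simulator, or an inner training loop, which is why the failure criterion can be phrased in terms of the Jacobian spectrum alone.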

Submission history

From: Luke Metz

[v1] Wed, 10 Nov 2021 16:51:04 UTC (706 KB)

[v2] Fri, 21 Jan 2022 02:07:54 UTC (742 KB)
