[Submitted on 13 May 2023]


Abstract: Automatic differentiation (AD) is a family of algorithms for computing
the numeric value of a function’s (partial) derivative, where the function is
typically given as a computer program or abstract syntax tree. AD has become
immensely popular as part of many learning algorithms, notably for neural
networks. This paper uses Prolog to systematically derive gradient-based
forward- and reverse-mode AD variants from a simple executable specification:
evaluation of the symbolic derivative. Along the way we demonstrate that
several Prolog features (DCGs, co-routines) contribute to the succinct
formulation of the algorithm. We also discuss two applications in probabilistic
programming that are enabled by our Prolog algorithms. The first is parameter
learning for the Sum-Product Loop Language and the second consists of both
parameter learning and variational inference for probabilistic logic
programming.
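
As a rough illustration of the executable specification the abstract refers to (evaluation of the symbolic derivative), the following minimal Prolog sketch, not taken from the paper, symbolically differentiates expressions over a single variable x and then evaluates the result at a point; the predicate names deriv/2 and eval/3 are illustrative assumptions, not the paper's code.

% deriv(+Expr, -DExpr): symbolic derivative of Expr with respect to x.
deriv(x, 1).
deriv(N, 0) :- number(N).
deriv(A + B, DA + DB) :- deriv(A, DA), deriv(B, DB).
deriv(A * B, DA*B + A*DB) :- deriv(A, DA), deriv(B, DB).

% eval(+Expr, +X, -Value): evaluate Expr with x bound to X.
eval(x, X, X).
eval(N, _, N) :- number(N).
eval(A + B, X, V) :- eval(A, X, VA), eval(B, X, VB), V is VA + VB.
eval(A * B, X, V) :- eval(A, X, VA), eval(B, X, VB), V is VA * VB.

% Example: the derivative of x*x + 3*x is 2x + 3, so at x = 2 we get 7.
% ?- deriv(x*x + 3*x, D), eval(D, 2, G).
% G = 7.

The paper's contribution is to derive forward- and reverse-mode AD variants from a specification of this kind, rather than to stop at symbolic differentiation itself.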

Submission history

From: Tom Schrijvers



[v1] Sat, 13 May 2023 09:34:53 UTC (41 KB)
