- Globale Ableitungen (Global Derivatives)
Deussen, Jens; Naumann, Uwe (Thesis advisor); Mitsos, Alexander (Thesis advisor)
Aachen : RWTH Aachen University (2021, 2022)
Dissertation / PhD Thesis
Dissertation, RWTH Aachen University, 2021
The computation of derivative information of numerical simulations is an important task for quantifying parameter sensitivities and for optimization purposes. Algorithmic differentiation (AD) methods provide exact derivatives at low implementation overhead for users and with high maintainability of the computer program. Recursive application of AD methods enables the computation of higher derivatives. This thesis demonstrates how to use higher-order AD models efficiently to compute higher derivatives by exploiting symmetry and sparsity; graph coloring algorithms are applied for this purpose. While vanilla AD methods compute derivative information that is valid only locally at a specified point, this thesis proposes methods to obtain a guaranteed enclosure of the derivative information on a specified domain. These enclosures are called global derivatives. The globalization of the derivative information can be achieved by applying interval arithmetic, e.g., the natural interval extension. Naive interval computations are prone to overestimating the actual value ranges. Special cases are identified for which the natural interval extension applied to the AD methods computes exact value ranges for the global derivatives. Furthermore, alternatives with better convergence, based on the mean value form and on McCormick relaxations of the AD modes, are presented. Two applications that benefit from global derivatives are provided: deterministic global optimization by branch-and-bound methods, and significance-driven unreliable and approximate computing. Within the global optimization case study, subdomain separability is introduced. This local property enables partitioning the optimization problem on subdomains that fulfill a certain monotonicity condition. Both the use of global derivatives and the application of subdomain separability substantially accelerate the convergence of the global solver for the presented problems.
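The core idea of a global derivative can be sketched as follows: running forward-mode (tangent) AD with interval arithmetic in place of floating-point arithmetic yields the natural interval extension of the derivative code, and hence an enclosure of the derivative over a whole domain rather than its value at one point. This is a minimal illustrative sketch, not the thesis implementation; the class and function names are hypothetical.

```python
# Sketch: forward-mode AD over intervals -> enclosure of f' on a domain.
# All names (Interval, Dual, f) are illustrative assumptions.

class Interval:
    def __init__(self, lo, hi):
        self.lo, self.hi = lo, hi

    def __add__(self, o):
        return Interval(self.lo + o.lo, self.hi + o.hi)

    def __mul__(self, o):
        # Interval product: min/max over the four endpoint products.
        p = [self.lo * o.lo, self.lo * o.hi, self.hi * o.lo, self.hi * o.hi]
        return Interval(min(p), max(p))

class Dual:
    """Tangent-mode AD value: primal v and tangent d, both intervals."""
    def __init__(self, v, d):
        self.v, self.d = v, d

    def __add__(self, o):
        return Dual(self.v + o.v, self.d + o.d)

    def __mul__(self, o):
        # Product rule, carried out in interval arithmetic.
        return Dual(self.v * o.v, self.d * o.v + self.v * o.d)

def f(x):
    return x * x * x  # f(x) = x^3, so f'(x) = 3x^2

# Seed the tangent with dx/dx = 1 and evaluate f over the domain [0, 1].
x = Dual(Interval(0.0, 1.0), Interval(1.0, 1.0))
y = f(x)
print(y.d.lo, y.d.hi)  # prints 0.0 3.0
```

For this particular example the enclosure [0, 3] coincides with the exact range of f'(x) = 3x^2 on [0, 1], illustrating the kind of special case mentioned above in which the natural interval extension of the AD code is exact; in general, the dependency problem of interval arithmetic causes overestimation, which motivates the mean value form and McCormick relaxations.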
A second case study demonstrates how to automatically prune artificial neural networks by using significance values. The results illustrate the benefit of global derivatives for approximate computing.
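The pruning step itself can be sketched generically: given per-weight significance values (in the thesis these are informed by global derivatives; here they are simply assumed to be precomputed), weights whose significance falls below a threshold are set to zero. This is a hypothetical minimal sketch, not the thesis procedure.

```python
# Sketch: significance-driven pruning. The significance values are assumed
# to be precomputed (e.g. from derivative enclosures); names are illustrative.

def prune(weights, significance, threshold):
    """Zero out weights whose significance is below the threshold."""
    return [w if s >= threshold else 0.0
            for w, s in zip(weights, significance)]

weights = [0.8, -0.05, 0.3, 0.01]
significance = [0.9, 0.02, 0.4, 0.01]  # assumed given, per weight
print(prune(weights, significance, 0.1))  # prints [0.8, 0.0, 0.3, 0.0]
```

Because a global derivative bounds a weight's influence over an entire input domain rather than at a single sample, such bounds give a conservative basis for deciding which weights are insignificant everywhere.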
- DOI: 10.18154/RWTH-2021-11739
- RWTH PUBLICATIONS: RWTH-2021-11739