Talk Sessions:

Poster Sessions:

June 22, Booth 12
June 24, Booth 12

Neural Network Heuristic Functions for Classical Planning: Bootstrapping and Comparison to Other Methods

Patrick Ferber, Florian Geißer, Felipe Trevizan, Malte Helmert and Joerg Hoffmann

Abstract: How can we train neural network (NN) heuristic functions for classical planning, using only states as the NN input? Prior work addressed this question by (a) per-instance imitation learning and/or (b) per-domain learning. The former limits the approach to instances small enough for training data generation; the latter, to domains where the necessary knowledge generalizes across instances. Here we explore three methods for (a) that make training data generation scalable through bootstrapping and approximate value iteration. In particular, we introduce a new bootstrapping variant that estimates search effort instead of goal distance, which, as we show, converges to the perfect heuristic under idealized circumstances. We empirically compare these methods to (a) and (b), aligning three different NN heuristic function learning architectures for cross-comparison in an experiment of unprecedented breadth in this context. Key lessons are that our methods and imitation learning are highly complementary; that per-instance learning often yields stronger heuristics than per-domain learning; and that, while the LAMA planner is still dominant, our methods outperform it in one benchmark domain.
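The bootstrapping idea the abstract mentions can be illustrated with a toy sketch: solve sampled states with the current (initially trivial) heuristic, use the observed plan costs as training labels, retrain, and sample progressively harder states. Everything below is an illustrative assumption, not the authors' implementation: a number-line domain stands in for a planning task, and a lookup table stands in for the neural network regressor.

```python
import random

# Toy domain standing in for a planning task: states are integers,
# the goal is state 0, and actions move one step left or right.
# The perfect goal-distance heuristic here would be h*(s) = abs(s).

def greedy_search(start, h, max_expansions=400):
    """Greedy best-first search guided by h; returns plan cost or None."""
    frontier = [(h(start), 0, start)]  # (h-value, g-value, state)
    seen = {start}
    while frontier and max_expansions > 0:
        frontier.sort()                # expand the state with lowest h
        _, g, s = frontier.pop(0)
        max_expansions -= 1
        if s == 0:
            return g                   # plan cost = steps taken
        for t in (s - 1, s + 1):
            if t not in seen:
                seen.add(t)
                frontier.append((h(t), g + 1, t))
    return None                        # search budget exhausted

def bootstrap(iterations=4, samples=25, radius=15, seed=0):
    """Bootstrapped training-data generation (conceptual sketch):
    label states with the cost of plans found under the current
    heuristic, retrain, then sample harder states and repeat.
    A lookup table stands in for the neural network."""
    rng = random.Random(seed)
    table = {}
    h = lambda s: table.get(s, 0)      # untrained model returns 0
    for it in range(1, iterations + 1):
        for _ in range(samples):
            s = rng.randint(-radius * it, radius * it)
            cost = greedy_search(s, h)
            if cost is not None:
                table[s] = cost        # training example (state, label)
    return table
```

In this sketch the labels come from plans found by the learned heuristic itself, so no pre-solved training instances are needed; the sampling radius grows per iteration, mirroring how bootstrapping scales data generation to harder states.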

*This password-protected talk video will only be available after it has been presented at the conference.