Path planning is usually solved with classical planning algorithms. A recent study on arXiv.org instead proposes a machine-learning approach to the task.
Transformers are used to capture long-range spatial relationships. The authors propose Spatial Planning Transformers (SPT), whose attention heads can attend to any part of the input map, rather than propagating information only between neighboring cells.
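To make the contrast with local, convolutional planners concrete, here is a hypothetical sketch (not the paper's code) of single-head self-attention over a flattened obstacle map: every grid cell is a token, so each cell can attend to every other cell in a single layer. All sizes and weight matrices below are illustrative stand-ins.

```python
import numpy as np

# Hypothetical illustration (not the authors' implementation): single-head
# self-attention over a flattened obstacle map. Each grid cell is a token,
# so every cell attends to every other cell in one step -- unlike a
# convolution, whose receptive field grows only a few cells per layer.

rng = np.random.default_rng(0)

H = W = 5            # a tiny 5x5 map for illustration
d = 8                # embedding dimension (arbitrary choice)
n = H * W            # number of tokens after flattening the grid

x = rng.normal(size=(n, d))            # stand-in embeddings of map + goal cells
Wq = rng.normal(size=(d, d)) / d**0.5  # query projection
Wk = rng.normal(size=(d, d)) / d**0.5  # key projection
Wv = rng.normal(size=(d, d)) / d**0.5  # value projection

q, k, v = x @ Wq, x @ Wk, x @ Wv
scores = q @ k.T / np.sqrt(d)          # (n, n): every cell scores every cell
attn = np.exp(scores - scores.max(axis=-1, keepdims=True))
attn /= attn.sum(axis=-1, keepdims=True)   # softmax over all 25 cells
out = attn @ v                         # globally mixed features in one step
```

The key point is the `(n, n)` attention matrix: information about an obstacle anywhere on the map can influence any other cell after a single layer.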
They formulate an end-to-end differentiable framework with the structure of a mapper and a planner built into it. The SPT planner is first pre-trained to capture a data-driven prior; the framework then backpropagates through it to learn a mapper that converts raw observations into an obstacle map. This lets the mapper be learned without direct map supervision or environment interaction, and the learned mapper and planner also generalize to unseen maps.
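As a minimal sketch of where the planner's "data-driven prior" could come from (an assumption for illustration, not the authors' code): on synthetic maps with known obstacles, ground-truth distance-to-goal labels can be generated cheaply, e.g. by breadth-first search, and used to pre-train the planner before it is frozen and backpropagated through.

```python
from collections import deque

def shortest_distances(obstacle_map, goal):
    """BFS distances from every free cell to `goal` on a 4-connected grid.

    Hypothetical helper (not from the paper): labels like these could
    supervise a planner to predict distance-to-goal on synthetic maps,
    giving it a prior before the mapper is trained through it.
    """
    H, W = len(obstacle_map), len(obstacle_map[0])
    INF = float("inf")
    dist = [[INF] * W for _ in range(H)]
    gr, gc = goal
    dist[gr][gc] = 0
    queue = deque([goal])
    while queue:
        r, c = queue.popleft()
        for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            nr, nc = r + dr, c + dc
            if (0 <= nr < H and 0 <= nc < W
                    and obstacle_map[nr][nc] == 0
                    and dist[nr][nc] == INF):
                dist[nr][nc] = dist[r][c] + 1
                queue.append((nr, nc))
    return dist

# A 3x3 map with a wall in the middle column (1 = obstacle, 0 = free).
demo_map = [[0, 1, 0],
            [0, 1, 0],
            [0, 0, 0]]
d = shortest_distances(demo_map, goal=(0, 2))
```

Obstacle cells keep an infinite distance, so a planner trained on such labels implicitly learns which cells are unreachable.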
The authors show that SPT outperforms prior state-of-the-art differentiable planners on both manipulation and navigation tasks.
From the paper's abstract: "We consider the problem of spatial path planning. In contrast to the classical solutions which optimize a new plan from scratch and assume access to the full map with ground truth obstacle locations, we learn a planner from the data in a differentiable manner that allows us to leverage statistical regularities from past data. We propose Spatial Planning Transformers (SPT), which given an obstacle map learns to generate actions by planning over long-range spatial dependencies, unlike prior data-driven planners that propagate information locally via convolutional structure in an iterative manner. In the setting where the ground truth map is not known to the agent, we leverage pre-trained SPTs in an end-to-end framework that has the structure of mapper and planner built into it which allows seamless generalization to out-of-distribution maps and goals. SPTs outperform prior state-of-the-art differentiable planners across all the setups for both manipulation and navigation tasks, leading to an absolute improvement of 7-19%."
Research paper: Singh Chaplot, D., Pathak, D., and Malik, J., “Differentiable Spatial Planning using Transformers”, 2021. Link: https://arxiv.org/abs/2112.01010