Abstract
Many works have recently explored sim-to-real transferable visual model predictive control (MPC). However, such works are limited to one-shot transfer, in which real-world data must be collected once to perform the sim-to-real transfer; this still demands significant human effort to transfer models learned in simulation to each new real-world domain. To alleviate this problem, we first propose a novel model-learning framework called the Kalman Randomized-to-Canonical Model (KRC-model), which extracts task-relevant intrinsic features and their dynamics from randomized images. We then propose Kalman Randomized-to-Canonical Model Predictive Control (KRC-MPC), a zero-shot sim-to-real transferable visual MPC built on the KRC-model. We evaluate the effectiveness of our method on a valve-rotation task performed by a robot hand in both simulation and the real world, and on a block-mating task in simulation. The experimental results show that KRC-MPC can be applied to various real domains and tasks in a zero-shot manner.