Abstract

We propose VRGym, a virtual reality testbed for realistic human-robot interaction. Unlike existing toolkits and virtual reality environments, VRGym emphasizes building and training both physical and interactive agents for robotics, machine learning, and cognitive science. VRGym leverages mechanisms that generate diverse 3D scenes with high realism through physics-based simulation. We demonstrate that VRGym is able to (i) collect human interactions and fine manipulations, (ii) accommodate various robots through a ROS bridge, (iii) support experiments for human-robot interaction, and (iv) provide toolkits for training state-of-the-art machine learning algorithms. We hope VRGym can help advance general-purpose robotics and machine learning agents, as well as assist human studies in the field of cognitive science.
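
As a concrete illustration of the ROS bridge in (ii), the sketch below shows how a controller node could exchange joint states and commands with a simulator-side bridge over standard ROS topics. The node name and topic names (/vrgym/joint_states, /vrgym/joint_command) are hypothetical placeholders for illustration, not VRGym's actual interface.

# Minimal sketch of a controller talking to a simulator over ROS topics.
# The node and topic names below are illustrative assumptions, not the
# actual VRGym API.
import rospy
from sensor_msgs.msg import JointState
from std_msgs.msg import Float64MultiArray

def on_joint_states(msg):
    # Trivial stand-in policy: drive each joint back toward zero.
    cmd = Float64MultiArray()
    cmd.data = [-0.5 * pos for pos in msg.position]
    command_pub.publish(cmd)

rospy.init_node('vrgym_bridge_demo')
command_pub = rospy.Publisher('/vrgym/joint_command',
                              Float64MultiArray, queue_size=1)
rospy.Subscriber('/vrgym/joint_states', JointState, on_joint_states)
rospy.spin()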

Paper

VRGym: A Virtual Testbed for Physical and Interactive AI
Xu Xie*, Hangxin Liu*, Zhenliang Zhang, Yuxing Qiu, Feng Gao, Siyuan Qi, Yixin Zhu, Song-Chun Zhu
ACM Turing Celebration Conference (ACM TURC), 2019
(* indicates equal contribution.)
Paper / Supplementary / Demo

Team

Xu Xie1

Hangxin Liu1

Zhenliang Zhang1

Yuxing Qiu1

Feng Gao1

Siyuan Qi1

Yixin Zhu1

Song-Chun Zhu1

1 UCLA Center for Vision, Cognition, Learning and Autonomy

Features
Code

Code for VRGym is coming soon.
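
In the meantime, the sketch below shows the kind of OpenAI Gym-style training loop the toolkits in (iv) are meant to support. It uses a standard Gym task as a stand-in, under the assumption that a released VRGym environment would slot into the same interface.

# Gym-style interaction loop (classic Gym API, pre-0.26).
# CartPole is a stand-in task; a VRGym scene would plug into
# the same reset/step interface once the code is released.
import gym

env = gym.make('CartPole-v1')
obs = env.reset()
for step in range(1000):
    action = env.action_space.sample()  # random policy as a placeholder
    obs, reward, done, info = env.step(action)
    if done:
        obs = env.reset()
env.close()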

Bibtex

@inproceedings{xie2019vrgym,
  author    = {Xie, Xu and Liu, Hangxin and Zhang, Zhenliang and Qiu, Yuxing and Gao, Feng and Qi, Siyuan and Zhu, Yixin and Zhu, Song-Chun},
  title     = {VRGym: A Virtual Testbed for Physical and Interactive AI},
  booktitle = {Proceedings of the ACM Turing Celebration Conference (ACM TURC)},
  year      = {2019}
}