Facebook AI Introduces Habitat 2.0: Next-Generation Simulation Platform

Facebook recently announced Habitat 2.0, a next-generation simulation platform that lets AI researchers teach machines to navigate photo-realistic 3D virtual environments and interact with objects just as they would in an actual kitchen or other commonly used space. With these tools at their disposal, researchers can test future innovations in simulation, without building expensive physical prototypes, before ever setting foot in the real world.

One of the most powerful ways to train robots to accomplish useful tasks in the real world is to teach them in simulation: exploring virtual worlds allows AI agents to practice a task thousands or even millions of times faster than they could in a real physical space.

Habitat 2.0 could be one of the fastest publicly available simulators of its kind. Because the simulation runs far faster than real time, AI agents can rapidly practice interacting with items, drawers, and doors in pursuit of predetermined goals, typically drawn from robotics research, and learn to carry out instructions by mimicking human actions as closely as possible.

With Habitat 2.0, AI researchers can build virtual robots that perform tasks like stocking the fridge or fetching objects with high reliability, without having to rely on static 3D scenes.

Introducing ReplicaCAD

Habitat 2.0’s new dataset, ReplicaCAD, is a rebuilt version of Replica, the dataset of 3D environments that Facebook Reality Labs released previously, re-created to support the movement and manipulation of objects. Previously static 3D scans have been converted into individual object models with physical parameters and collision proxy shapes, so that agents can be trained to move and manipulate objects safely, without, for example, running into walls.
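
To make this concrete, here is a minimal sketch, written in the style of habitat-sim's per-object JSON configs, of how a rebuilt object might pair a render mesh with a simplified collision proxy and physical parameters. The field names and asset paths are illustrative assumptions, not the shipped ReplicaCAD metadata.

```python
import json

# Illustrative sketch of a rigid-object configuration in the style of
# habitat-sim's *.object_config.json files. Paths and values are
# placeholders, not actual ReplicaCAD data.
object_config = {
    "render_asset": "objects/kitchen_drawer.glb",        # detailed visual mesh
    "collision_asset": "objects/kitchen_drawer_cv.glb",  # simplified collision proxy
    "mass": 4.0,                  # kg, used by the rigid-body solver
    "friction_coefficient": 0.5,  # surface friction for contact resolution
}

with open("kitchen_drawer.object_config.json", "w") as f:
    json.dump(object_config, f, indent=2)
```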

Habitat 2.0 Simulator: Faster speeds open up more research possibilities

Habitat 2.0 builds on and expands the capabilities of the Habitat-Sim simulation engine by supporting piecewise-rigid objects such as cabinets and drawers that can rotate on an axis or slide; articulated robots that include mobile manipulators like Fetch, fixed-base arms like Franka, and quadrupeds like AlienGo; and rigid-body physics (via Bullet physics engine).
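
As a rough illustration of how these pieces fit together, the sketch below uses habitat-sim's Python API to create a physics-enabled simulator and step the Bullet rigid-body solver. The scene path is a placeholder, and attribute names can vary between habitat-sim versions.

```python
import habitat_sim

# Minimal sketch: configure a simulator with physics enabled and step the
# Bullet rigid-body solver. The ReplicaCAD scene path is a placeholder.
sim_cfg = habitat_sim.SimulatorConfiguration()
sim_cfg.scene_id = "data/replica_cad/apartment_0.scene_instance.json"
sim_cfg.enable_physics = True  # rigid-body dynamics via Bullet

agent_cfg = habitat_sim.agent.AgentConfiguration()
sim = habitat_sim.Simulator(habitat_sim.Configuration(sim_cfg, [agent_cfg]))

# Advance physics at 60 Hz; articulated objects (drawers, cabinet doors)
# and rigid bodies are integrated each step.
for _ in range(60):
    sim.step_physics(1.0 / 60.0)

sim.close()
```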

In building Habitat 2.0, the team prioritized speed and performance over breadth of simulation capability, so that the research community can test new approaches and iterate more effectively. For instance, rather than simulating wheel-ground contact, Habitat 2.0 moves the robot along a navigation mesh. The platform also does not currently support non-rigid dynamics such as deformables, liquids, films, cloths, and ropes, nor audio or tactile sensing. This streamlined focus makes the Habitat 2.0 simulator two orders of magnitude faster than most 3D simulators available to academics and industry professionals.
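
The navigation-mesh shortcut can be sketched as follows, assuming habitat-sim's PathFinder API and a simulator whose scene already has a navmesh loaded. This is an illustrative snippet, not Habitat 2.0's internal locomotion code.

```python
import habitat_sim

# Sketch of navmesh-based point-to-point planning, the abstraction used in
# place of simulating wheel-ground contact. Assumes `sim` was created with
# a scene that includes a navigation mesh.
def plan_path(sim: habitat_sim.Simulator, goal):
    start = sim.pathfinder.get_random_navigable_point()
    path = habitat_sim.ShortestPath()
    path.requested_start = start
    path.requested_end = sim.pathfinder.snap_point(goal)  # project goal onto the mesh
    found = sim.pathfinder.find_path(path)  # fills path.points and path.geodesic_distance
    return path.points if found else None
```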

Home Assistant Benchmark: New milestones for home assistant training tasks

The ReplicaCAD dataset and the Habitat 2.0 simulator make it possible to create a new library of household assistive tasks called the Home Assistant Benchmark (HAB). Tasks in HAB include general chores like setting the table, cleaning the fridge, and tidying the house; robot skills like navigation, pick, place, open cabinet drawer, and open fridge door; and agent configurations for common household errands. HAB requires that robots not assume any prior knowledge of the environment (so they can deal with new environments and radical changes to known ones) and operate exclusively from onboard sensors such as RGB-D cameras and egomotion and joint-position sensors.
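
As an illustration of what operating exclusively from onboard sensors looks like in practice, the sketch below configures an RGB-D camera suite using recent habitat-sim sensor specs. The resolutions and uuids are arbitrary choices, not HAB's official configuration.

```python
import habitat_sim

# Sketch of an onboard RGB-D sensor suite of the kind HAB agents rely on.
def make_rgbd_agent_config():
    rgb = habitat_sim.CameraSensorSpec()
    rgb.uuid = "rgb"
    rgb.sensor_type = habitat_sim.SensorType.COLOR
    rgb.resolution = [256, 256]

    depth = habitat_sim.CameraSensorSpec()
    depth.uuid = "depth"
    depth.sensor_type = habitat_sim.SensorType.DEPTH
    depth.resolution = [256, 256]

    # The agent sees the world only through these sensor observations.
    agent_cfg = habitat_sim.agent.AgentConfiguration()
    agent_cfg.sensor_specifications = [rgb, depth]
    return agent_cfg
```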

In the future, Habitat will seek to model living spaces in more places around the world, enabling more varied training that takes into account cultural- and region-specific furniture layouts, furniture types, and object types. The team acknowledges these representational challenges and is working to improve the diversity and geographic inclusion of the 3D environments currently available for research. And while Habitat 2.0 is already a fast simulator, the team is working to speed it up further by addressing potential bottlenecks, such as its handling of synchronized parallel environments and the need to reload assets when an episode resets. Holistically reorganizing the interplay between rendering, physics, and reinforcement learning is an exciting direction for future work.

Source: Facebook AI
