Meta’s Habitat 3.0 simulates real-world environments for intelligent AI robot training

Researchers from Meta Platforms Inc.’s Fundamental Artificial Intelligence Research team said today they’re releasing a more advanced version of the AI simulation environment Habitat, which is used to teach robots how to interact with the physical world.

Along with the launch of Habitat 3.0, the company announced the release of the Habitat Synthetic Scenes Dataset, an artist-authored 3D dataset that can be used to train AI navigation agents, as well as HomeRobot, an affordable robot assistant hardware and software platform for use in both simulated and real-world environments.

In a blog post, FAIR researchers explained that the new releases represent the team’s ongoing progress in what it calls “embodied AI.” By that, they mean AI agents that can perceive and interact with their environment, share that environment safely with human partners, and communicate with and assist those partners in both the digital and the physical world.

Habitat is a catalog of virtual environments such as office spaces, homes and warehouses that can be used to train and refine AI-powered robots to navigate in the real world. The virtual environments within it are constructed with meticulous detail using an infrared capture system that goes as far as measuring the exact shape and size of objects such as tables, chairs and even books. Within these environments, researchers can train robots to complete complex, multistep tasks that require the ability to see and understand their surroundings.
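To make that concrete, here is a minimal sketch of how a researcher might step an agent through one of these virtual environments using the open-source habitat-lab library. The config path and the random action choice are illustrative assumptions that depend on the installed Habitat version and downloaded scene data; they are not details from Meta’s announcement.

```python
# A minimal sketch of running an agent in a Habitat scene with habitat-lab.
# The YAML path below is an assumption and varies between Habitat releases.
import habitat

# Load a benchmark navigation task config (illustrative path).
config = habitat.get_config("benchmark/nav/pointnav/pointnav_habitat_test.yaml")

env = habitat.Env(config=config)
observations = env.reset()  # RGB, depth and goal sensor readings for the agent

while not env.episode_over:
    action = env.action_space.sample()  # random policy, a stand-in for a trained model
    observations = env.step(action)

print(env.get_metrics())  # e.g. success and SPL for the navigation episode
env.close()
```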

Habitat 3.0 builds on those existing capabilities with support for both robot and humanoid avatars, enabling human-robot collaboration on many different tasks. For example, humans and robots can work together to clean up a living room or prepare a recipe in the kitchen. With this, FAIR says, it’s opening up new avenues for research into human-robot collaboration on a range of diverse, realistic tasks. The human avatars within Habitat 3.0 have a natural gait and realistic movements designed to support both low- and high-level interactions, FAIR said.

“This cohabitation of humans and robots in the simulation environment allows us for the first time to learn robotics AI policies in the presence of humanoid avatars in home-like environments on everyday tasks and evaluate them with real humans-in-the-loop,” the researchers wrote.

FAIR said Habitat 3.0 will reduce the time it takes for robot AI agents to learn new skills from months or even years to just a few days. It will also enable far more rapid testing of new models in safe, simulated environments, without putting physical hardware or people at risk.

The Habitat Synthetic Scenes Dataset, called HSSD-200, will also help accelerate embodied AI research, since 3D simulations of real-world scenes are critical for training. FAIR explained that HSSD-200 is superior to the datasets it has previously made available because its 3D scenes mirror physical-world scenes much more accurately. It consists of 211 high-quality 3D scenes that replicate real-world houses and other settings, and contains a diverse set of 18,656 models of physical-world objects across 466 semantic categories.

According to FAIR, HSSD-200 offers fine-grained semantic categorization corresponding to the WordNet ontology, while its asset compression enables higher-performance embodied AI simulation. The individual objects were all created by professional 3D artists and accurately match the appearance and size of furniture and appliances made by real-world brands.
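The WordNet alignment means each object category is tied to a node in a general-purpose lexical hierarchy, so coarser groupings come for free. The short sketch below, using the NLTK copy of WordNet, shows the idea; the category names are hypothetical stand-ins, not labels taken from HSSD-200.

```python
# Illustration (not FAIR's code) of aligning object categories with WordNet:
# each category name maps to a synset, and its hypernym chain gives coarser groupings.
from nltk.corpus import wordnet as wn  # requires: nltk.download("wordnet")

# Hypothetical category labels in the style of HSSD-200's semantic categories.
categories = ["armchair", "refrigerator", "table_lamp"]

for name in categories:
    synsets = wn.synsets(name)
    if not synsets:
        print(f"{name}: no WordNet entry")
        continue
    synset = synsets[0]  # take the most common sense
    chain = [synset.name()]
    while synset.hypernyms():  # walk up the hierarchy toward broader concepts
        synset = synset.hypernyms()[0]
        chain.append(synset.name())
    print(f"{name}: " + " -> ".join(chain))
```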

Finally, FAIR introduced the new HomeRobot library, a hardware and software specification for researchers who want to build a physical robot and put their Habitat-trained models to use in the physical world.

HomeRobot is based on a user-friendly software stack and affordable hardware components, meaning it can be set up quickly and made ready for real-world testing. It’s designed specifically for Open-Vocabulary Mobile Manipulation research, which refers to robots that can pick up any object in any unseen environment and place it in a specified location. To do this, robots must be able to perceive and understand new scenes they encounter.
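The Open-Vocabulary Mobile Manipulation loop itself has a simple structure, which the hedged sketch below illustrates with hypothetical placeholder classes rather than the actual HomeRobot API: locate the named object in an unseen scene, navigate to it, grasp it, then navigate to the requested receptacle and place it.

```python
# Hedged sketch of the Open-Vocabulary Mobile Manipulation (OVMM) loop.
# ObjectDetector, Robot and their methods are hypothetical placeholders,
# not HomeRobot's API; only the overall structure is the point.
from dataclasses import dataclass


@dataclass
class Pose:
    x: float
    y: float
    theta: float


class ObjectDetector:
    """Hypothetical open-vocabulary detector: maps a free-form text label to a pose."""

    def locate(self, label: str) -> Pose:
        # A real system would run an open-vocabulary vision model on camera input;
        # this stub returns a fixed pose for illustration.
        return Pose(1.0, 2.0, 0.0)


class Robot:
    """Hypothetical mobile manipulator with navigation and grasping primitives."""

    def navigate_to(self, pose: Pose) -> None:
        print(f"navigating to ({pose.x:.1f}, {pose.y:.1f})")

    def pick(self, label: str) -> None:
        print(f"picking up '{label}'")

    def place(self, label: str) -> None:
        print(f"placing object on '{label}'")


def open_vocab_pick_and_place(robot: Robot, detector: ObjectDetector,
                              target_object: str, receptacle: str) -> None:
    """Pick the named object anywhere in the scene and place it on the receptacle."""
    robot.navigate_to(detector.locate(target_object))
    robot.pick(target_object)
    robot.navigate_to(detector.locate(receptacle))
    robot.place(receptacle)


if __name__ == "__main__":
    open_vocab_pick_and_place(Robot(), ObjectDetector(), "coffee mug", "kitchen table")
```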

Holger Mueller of Constellation Research Inc. said Meta’s announcement shows real progress that goes beyond the generative AI hype, with powerful software that can be used to train and test intelligent robots in virtual worlds. “Habitat 3.0 is now focused on human-machine interaction, because this is a key milestone that needs to be perfected if we’re to build robots that can feature in our daily lives,” he said. “The HSSD-200 dataset is useful because generating the physical objects within these environments is expensive and takes a lot of time.”

FAIR said there’s a lot more to come from these developments. Its ongoing research into embodied AI will focus next on how robots can collaborate with humans in dynamic, constantly changing environments that reflect the real world we live in.

“In the next phase of our research, we’ll use the Habitat 3.0 simulator to train our AI models so these robots are able to assist their human partners and adapt to their preferences,” the researchers explained. “We’ll use HSSD-200 in conjunction with Habitat 3.0 to collect data on human-robot interaction and collaboration at scale so we can train more robust models. And we’ll focus on deploying the models learned in simulation into the physical world so we can better gauge their performance.”

Image: Meta Platforms
