
Researchers Propose Neural Whole-Body Controller for Humanoid Robots

Researchers from NVIDIA, CMU, UC Berkeley, UT Austin, and UC San Diego presented HOVER, a “versatile neural whole-body controller for humanoid robots.” The multi-mode policy distillation framework lets a robot coordinate all of its limbs with a single model. NVIDIA’s Senior Research Manager and Lead of Embodied AI Jim Fan noted that the team trained a neural network with only 1.5 million parameters, a small footprint compared with the billions of parameters common in today’s large models. HOVER supports several high-level motion tasks; Fan listed a few of these “control modes” (a sketch of how they can share one command interface follows the list):

  • Head and hand poses: captured by XR devices such as the Apple Vision Pro.
  • Whole-body poses: from MoCap or an RGB camera.
  • Whole-body joint angles: from an exoskeleton.
  • Root velocity commands: from joysticks.

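These modes can be thought of as slices of one shared command space: a single policy reads the full command vector together with a mask marking which entries the current input device actually supplies. The Python sketch below illustrates that idea; the slice layout, dimensions, and the `build_command` helper are illustrative assumptions, not HOVER’s actual interface.

```python
# Minimal sketch of a unified multi-mode command interface (assumed layout,
# not HOVER's actual API). Each input device fills only the slice it can
# provide; a binary mask tells the single shared policy which entries are valid.
import numpy as np

# Hypothetical command layout for a humanoid with 19 actuated joints.
SLICES = {
    "head_hand_poses":  slice(0, 21),    # 3 keypoints x 7D pose (XR devices)
    "whole_body_poses": slice(21, 90),   # 23 keypoints x 3D position (MoCap/RGB)
    "joint_angles":     slice(90, 109),  # 19 joint targets (exoskeleton)
    "root_velocity":    slice(109, 112), # vx, vy, yaw rate (joystick)
}
CMD_DIM = 112

def build_command(mode: str, values: np.ndarray) -> tuple[np.ndarray, np.ndarray]:
    """Pack one device's input into the shared command vector plus a mask."""
    cmd = np.zeros(CMD_DIM, dtype=np.float32)
    mask = np.zeros(CMD_DIM, dtype=np.float32)
    sl = SLICES[mode]
    cmd[sl] = values
    mask[sl] = 1.0  # only these entries are meaningful this step
    return cmd, mask

# Example: a joystick supplies only a root velocity command.
cmd, mask = build_command("root_velocity", np.array([0.5, 0.0, 0.1], np.float32))
policy_input = np.concatenate([cmd, mask])  # fed to the single shared policy
print(policy_input.shape)  # (224,)
```
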
According to Fan, HOVER provides a unified interface for controlling a robot with “whichever input devices are convenient at hand,” an easier way to collect whole-body teleoperation data for training, and a path for an upstream Vision-Language-Action model to supply motion instructions, which HOVER translates into low-level motor signals at high frequency.
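
That layering can be sketched as a simple two-rate loop: a slow upstream source (a teleop device or a Vision-Language-Action model) issues motion commands, while the low-level whole-body policy runs at a higher rate and emits motor targets. The rates, function names, and placeholder bodies below are assumptions for illustration, not HOVER’s published implementation.

```python
# Two-rate control loop sketch: slow high-level commands, fast low-level policy.
import time

HIGH_LEVEL_HZ = 5   # assumed rate for the VLA / teleop command stream
LOW_LEVEL_HZ = 50   # assumed rate for the whole-body controller

def get_upstream_command(t: float) -> dict:
    """Placeholder for an XR device, joystick, or VLA model output."""
    return {"mode": "root_velocity", "values": [0.3, 0.0, 0.0]}

def whole_body_policy(command: dict, proprio: dict) -> list[float]:
    """Placeholder for the single low-level policy; returns 19 joint targets."""
    return [0.0] * 19

def control_loop(duration_s: float = 1.0) -> None:
    command = get_upstream_command(0.0)
    steps = int(duration_s * LOW_LEVEL_HZ)
    for i in range(steps):
        t = i / LOW_LEVEL_HZ
        # Refresh the high-level command only every LOW_LEVEL_HZ // HIGH_LEVEL_HZ steps.
        if i % (LOW_LEVEL_HZ // HIGH_LEVEL_HZ) == 0:
            command = get_upstream_command(t)
        proprio = {"joint_pos": [0.0] * 19, "joint_vel": [0.0] * 19}
        motor_targets = whole_body_policy(command, proprio)
        # motor_targets would be sent to the robot's actuators here.
        time.sleep(1.0 / LOW_LEVEL_HZ)

if __name__ == "__main__":
    control_loop()
```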

Full report: NVIDIA Introduces HOVER, a 1.5M Parameter Neural Network for Humanoid Robotics.

Tagged: NVIDIA Robotics