Over the past three years, Péter Fankhauser’s industrial robots have gone from climbing stairs to jumping between boxes, doing backflips and performing other parkour-style tricks. The robots were not programmed to perform these new actions; instead, powered by new artificial intelligence models, they adapted to their environment. “These are the moments where you think this is the next revolution,” said Fankhauser, chief executive of ANYbotics, a Zurich-based robotics start-up. “These things started to move really artistically, and it’s almost scary because the robots play with physics.”

Over the past decade, the $74bn robotics sector has made rapid gains in capability thanks to significant leaps in AI, such as advances in neural networks, systems that mimic the human brain. The world’s biggest tech and AI companies, including Google, OpenAI and Tesla, are among those racing to build the AI “brain” that can autonomously operate robots, in moves that could transform industries from manufacturing to healthcare.

In particular, improved computer vision and spatial reasoning capabilities have allowed robots to gain greater autonomy while navigating varied environments, from construction sites to oil rigs and city roads.

Training and programming robots previously required engineers to hardwire rules and instructions that taught each machine how to behave, often specific to a single system or environment. The advent of deep learning models in recent years has enabled experts to train AI software that lets machines learn by themselves, making them far more adaptive and reactive to unexpected physical challenges in the real world.
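To make the shift described above concrete, here is a minimal sketch, assuming a PyTorch-style setup, of the difference between the two approaches: a hand-written controller encodes behavior as explicit rules, while a learned policy is a small neural network that maps sensor observations to joint commands and is tuned from data (for example, via reinforcement learning in simulation) rather than programmed case by case. This is an illustrative toy, not ANYbotics’ or any company’s actual system, and all names and dimensions are hypothetical.

```python
# Illustrative sketch only: contrasts a hard-coded rule with a learned policy.
# Assumes PyTorch; all class names, sizes and thresholds are hypothetical.
import torch
import torch.nn as nn


def rule_based_controller(obs: torch.Tensor) -> torch.Tensor:
    """Old approach: engineers hardwire behavior for known situations."""
    # e.g. "if the terrain ahead rises more than 0.1 m, lift the legs higher"
    step_height = obs[0]
    lift = torch.where(step_height > 0.1, torch.tensor(0.3), torch.tensor(0.1))
    return lift.expand(12)  # one fixed command pattern for all 12 joints


class LearnedPolicy(nn.Module):
    """New approach: a neural network maps raw observations to joint
    commands and is trained from experience instead of rule by rule."""

    def __init__(self, obs_dim: int = 48, act_dim: int = 12):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(obs_dim, 256), nn.ELU(),
            nn.Linear(256, 128), nn.ELU(),
            nn.Linear(128, act_dim),  # one target per actuated joint
        )

    def forward(self, obs: torch.Tensor) -> torch.Tensor:
        return self.net(obs)


policy = LearnedPolicy()
obs = torch.randn(48)        # stand-in for proprioceptive + terrain sensing
joint_targets = policy(obs)  # 12 joint commands produced by learned weights
print(joint_targets.shape)   # torch.Size([12])
```

The point of the contrast: the rule-based controller only handles the situations its authors anticipated, whereas the learned policy’s behavior comes from its trained weights, which is what lets such systems generalize to stairs, boxes and other terrain they were never explicitly programmed for.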