
Researchers Trick Tesla to Drive into Oncoming Traffic

A new study by Tencent’s Keen Security Lab underscores that recent warnings by artificial intelligence (AI) experts about the risks of adversarial machine learning are more than justified.

After studying how the Enhanced Autopilot driver-assistance system in Tesla vehicles reads and processes environmental data to decide when it needs to change lanes, the researchers were able to trick the Autosteer feature simply by painting interference patches on the road. In other words, a threat actor could steer a Tesla into oncoming traffic without ever hacking into the vehicle itself.
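For readers unfamiliar with adversarial machine learning, the sketch below illustrates the general idea in its simplest form: a small, deliberately crafted perturbation added to a model's input can change its prediction. This is a generic, hypothetical illustration using a toy PyTorch classifier and the standard fast gradient sign method (FGSM); it is not the technique Keen Security Lab used against Autopilot, and the model, input, and labels are invented for the example.

```python
# Minimal FGSM sketch: a toy example of an adversarial perturbation.
# Illustrative only -- NOT the Keen Security Lab attack on Tesla Autopilot.
import torch
import torch.nn as nn

torch.manual_seed(0)

# Toy classifier standing in for any image-based perception model:
# flattens a small image and predicts one of two classes.
model = nn.Sequential(nn.Flatten(), nn.Linear(3 * 32 * 32, 2))
model.eval()

image = torch.rand(1, 3, 32, 32, requires_grad=True)  # placeholder input
true_label = torch.tensor([0])                         # placeholder label

# Compute the loss gradient with respect to the input pixels.
loss = nn.functional.cross_entropy(model(image), true_label)
loss.backward()

# FGSM: nudge every pixel a small step in the direction that increases the loss.
epsilon = 0.05
adversarial = (image + epsilon * image.grad.sign()).clamp(0, 1)

print("original prediction:   ", model(image).argmax(dim=1).item())
print("adversarial prediction:", model(adversarial).argmax(dim=1).item())
```

The Keen Lab work applies the same underlying principle in the physical world: instead of perturbing pixels digitally, carefully placed markings on the road act as the perturbation that misleads the lane-recognition model.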

Read more: Researchers Trick Tesla to Drive into Oncoming Traffic

OODA Analyst

OODA comprises a unique team of international experts capable of providing advanced intelligence and analysis, strategy and planning support, risk and threat management, training, decision support, crisis response, and security services to global corporations and governments.