A team of researchers from prominent universities – including SUNY Buffalo, Iowa State, UNC Charlotte, and Purdue – was able to turn an autonomous vehicle (AV) running the open-source Apollo driving platform from Chinese web giant Baidu into a deadly weapon by tricking its multi-sensor fusion system, and suggests the attack could be applied to other self-driving cars.
TL;DR: faking out a self-driving system will always be possible, just as faking out human drivers is. But doing so is basically attempted murder, which is why the existence of an exploit like this is neither interesting nor new. You could also cut the brake lines or rig a bomb to the car.