A Silicon Valley executive has taken Tesla Motors to task after viewing a video of a Model S using the Autopilot system bumping into the back of a work truck.

Andrew Ng, chief scientist of Chinese web services company Baidu and co-founder of Google’s deep-learning project, posted a comment on Twitter after viewing a CNET article. The article includes the video, and Ng’s tweet was reported on Electrek. Ng wrote that Tesla is being “irresponsible to ship driving system that works 1,000 times and lulls false sense of safety, then… BAM!”

The short video shows a Model S driving on a highway in Switzerland with Tesla’s Adaptive Cruise Control engaged. The system ignores the truck’s flashing brake lights and plows into its back end at low speed.

Tesla Autopilot hits back of truck

The Autopilot feature “does not turn a Tesla into an autonomous vehicle and does not allow the driver to abdicate responsibility,” Tesla said in an e-mail to Autoblog. “Since the release of Autopilot, we’ve continuously educated customers on the use of the feature, reminding them that they’re responsible for remaining alert and present when using Autopilot and must be prepared to take control at all times.”

Two other Tesla drivers have recently said that the Autopilot system was responsible for collisions they had in Utah and California. The Utah accident involved a Tesla that rolled into a parked trailer after the driver said the car’s Summon feature was engaged. The California incident involved a Tesla driver rear-ending another vehicle on Interstate 5. No one was injured in either collision.

Tesla CEO Elon Musk said this spring that the car’s Autopilot system can reduce the probability of an accident by 50 percent.