Is Tesla Responsible for the Deadly Crash on Autopilot?


By Patrick Lin 

Tesla’s Autopilot had its first fatality, the company announced yesterday. Statistically, this was bound to happen. The self-driving car broadsided a truck that its sensors failed to detect and that the driver apparently didn’t see either.

Does Tesla have any responsibility for the accident, even if the driver was supposed to be watching the road at all times?

The argument that the driver, not Tesla, was responsible is that the driver had agreed to monitor the road at all times, precisely in case of emergencies like this one that the car cannot handle. This is part of the company’s standard agreement before it allows customers to use the Autopilot feature, which has been in beta testing since its introduction last October. (Beta testing means working out the last bugs in a product before its official public release.)


