Tesla plans to make its Autopilot software more restrictive

When Tesla activated its Autopilot software last year, it didn't take long before Model S owners began pushing the software to absurd limits.

Videos of Model S owners engaging in downright dangerous behavior soon began appearing online, from a driver eating breakfast while his Model S cruised down a highway at 90 MPH to an individual who turned on Autopilot and then climbed into the back seat.


Such stunts quickly became popular entertainment, and the viral videos reached a whole new level. But behind the spectacle is a 5,000-pound car capable of untold amounts of damage; this kind of behavior endangers everyone in and around the vehicle and could easily prove fatal.

As a result, Tesla said it would implement measures to limit some of the reckless activities Model S owners were engaging in.

“There’s been some fairly crazy videos on YouTube,” Tesla CEO Elon Musk said during a conference call a few months ago. “This is not good. And we will be putting some additional constraints on when autopilot can be activated to minimize the possibility of people doing crazy things with it.”

Tesla did eventually restrict some Autopilot use cases, and word now comes that the company plans to build additional safeguards into the software to further deter drivers from dangerous behavior.

The new Autopilot measures are reportedly the result of recent Tesla accidents in which drivers misused the feature. In one recent accident in China, the driver was busy retrieving a cloth from the glove compartment and wiping his dashboard. In another, which occurred in Montana, Tesla found that a Model X driver had ignored repeated warnings to take back control of the car.

It is thus expected that the next-generation Autopilot software will impose more constraints on when and how drivers can use the feature. Currently, Autosteer is designed to turn off when the driver fails to respond after 15 seconds of "visual warnings and audible tones." With the upcoming update, these restrictions might become tighter.

Whatever shape the update takes, the safety of people in and around the car should remain the primary objective.