
Elon Musk pushes boundaries in Tesla autonomous vehicle campaign

Videos shared by Tesla customers show erratic behavior in the latest update to Tesla’s driver-assistance system, dubbed “FSD Beta.”

  • Tesla previously issued a recall covering 54,000 vehicles with FSD Beta to disable a feature that let the cars roll through a stop sign without coming to a full stop in certain circumstances.

Tesla CEO Elon Musk frequently proclaims that fully autonomous vehicles are just around the corner, but it’s unclear how close that future really is for the electric automaker.

Meanwhile, the company is rolling out new capabilities under names like Full Self-Driving (FSD), which critics argue are misleading, in a US regulatory climate that has often taken a hands-off approach to emerging technologies.

Tesla owners have shared videos online showing erratic behavior in “FSD Beta,” the most recent update to Tesla’s driver-assistance system.

The clips show cars making strange turns, hitting safety cones, and lurching unexpectedly.

Earlier this month, Tesla issued a recall for 54,000 vehicles equipped with FSD Beta to disable a feature that allowed the cars to roll through a stop sign without fully stopping in some circumstances.

The recall highlights a drawback of Musk’s risk-taking strategy, which has been credited with making electric vehicles a mainstream choice in the US and other markets.

“The rolling stop recall was not caused by an honest engineering error, but rather by a choice Tesla claims was taken intentionally to breach traffic laws,” said Phil Koopman, an autonomous vehicle expert at Carnegie Mellon University.

The National Highway Traffic Safety Administration (NHTSA) opened an investigation last year following a series of crashes in which Teslas using the company’s “Autopilot” driver-assistance technology struck first-responder vehicles.

Under US regulations, new vehicles are not systematically certified by safety inspectors before they reach the market. Instead, automakers must self-certify that their products comply with the rules.

The NHTSA intervenes only if a vehicle has a defect that raises questions about its compliance or is deemed hazardous.

According to Bryant Walker Smith, a law and mobility expert affiliated with Stanford Law School, regulators may in some cases have no rules at all governing technologies like adaptive cruise control.

During Donald Trump’s administration, the NHTSA avoided taking actions that might have hindered the development of driverless technology.

After President Joe Biden took office, however, the NHTSA began to examine the safety issues surrounding driver-assistance systems more seriously.