Tesla is the only carmaker beta testing 'autopilot' tech, and that's a problem

The recent news of the first-known fatality involving Tesla's semi-autonomous Autopilot system is undeniably tragic. However, it also highlights one distinct difference between the upstart carmaker and the rest of the established automotive industry: Tesla is beta testing the tech on the public.
Every other automaker — from BMW to Cadillac — flat-out refuses to beta test driverless tech on the public. Other semi-autonomous driving systems, like Volvo's Pilot Assist and Mercedes' Drive Pilot, are in public hands, but they're not beta tests. More on those later.
For those unaware of the concept of beta testing, it's the process of testing something, like software, that's not completely finished yet. The first step, alpha testing, removes the huge, glaring issues. Beta testing further hones the product.
Beta testing is the norm in the tech industry, especially for apps. It's a way to get a product into the hands of customers and let them discover the product's flaws. Apple's Siri is a well-known product honed with beta testing. This process allows developers to address known issues as they arise and remedy them with updates, etc.
Applying this methodology, which is perfectly acceptable for a smartphone app, to a 4,000-pound car is problematic. When a beta-test app crashes, you might be annoyed or lose some data. When beta-test software in your car fails, you could quite literally crash.
So why is Tesla doing it?

Click the box, trust the car

When the news of the May 7 fatal crash broke last week, Tesla responded to it with a blog post. In it, the company stressed that Autopilot cannot be enabled without the driver checking an acknowledgment box that warns them the system is in beta and that the driver needs to "maintain control and responsibility" for their vehicle.
This, though, is far from satisfactory when it comes to ensuring the safety of the Tesla driver as well as everyone else on the road. That's because Autopilot is an advanced driver assist system, but it doesn't have the same precautionary backups that a fully autonomous car would have.
As I discussed in the initial aftermath of the crash, autonomous driving is graded in levels: Level 0 is traditional driving, and Level 4 is full autonomy. Autopilot is Level 2, though to many it feels like Level 4. By that I mean the system feels so in control that a driver could be lulled into a false sense of security and become distracted, which appears to be what happened in the fatal crash.
I recently reviewed the new Mercedes Drive Pilot system, which is ostensibly similar to Autopilot. It can accelerate, brake, steer and autonomously change lanes. However, from the driver's seat, Drive Pilot feels less robust than Autopilot. That's because it constantly requires more driver intervention than Autopilot does.
Put simply, Drive Pilot is Level 2, and acts like it. Autopilot does not.
Mercedes' Drive Pilot could do more. In fact, it's the first production car to receive an autonomous driving license in Nevada. But Mercedes deliberately limited its capabilities in the cars it's selling because it wanted drivers to know they were ultimately responsible for what the car was doing.
By contrast, Tesla's Autopilot is riding on the cutting edge. In the hands of the public, it's doing the most a Level 2 system possibly can. Drive Pilot is doing nearly the least.
Here's where we get to the crux of the acknowledgement box problem. Simply warning that the system is in beta testing doesn't do enough to impart the gravity of a system failure. Does it warn drivers that a system malfunction — or, as appears to have happened in the Florida crash, a situation the system can't handle — may result in death or serious injury?

Thinking like a plane

In an email to Mashable following the report of the crash, Tesla CEO Elon Musk likened Autopilot to the autopilot systems on airplanes: "They reduce pilot workload and increase safety, but should not be relied upon without human oversight. That's why we used the same name."
That ignores the fact that the Federal Aviation Administration requires student pilots to have a minimum of 40 hours of flight time, 20 hours of which must be flown with an instructor. Not to mention commercial pilots have copilots to help them fly safely. By comparison, the average Department of Motor Vehicles driving test takes about 20 minutes.
Granted, flying a 737 is not wholly akin to driving a luxury sedan. So let's contrast Tesla's beta testing with another carmaker's safety tech development policies.
Audi, one of the pioneering driverless tech brands, has been developing semi- and fully autonomous systems for over 11 years. One of its representatives emphatically told Mashable that Audi has never once — nor will it ever — beta test on the public. It tests everything thoroughly with engineers on both public and private roads. Moreover, every one of its safety systems is fully certified before it's implemented in a production car.
In addition, Audi won't let just any of its employees drive its autonomous test cars. The engineers allowed to test the systems are first licensed by Audi in evasive driving. What's more, not every Audi engineer passes the rigorous test — it's that difficult.
Every other automaker I spoke to has similarly stringent guidelines for its employees charged with testing autonomous technology. None of them would dare release any kind of hardware or software to the public that wasn't fully and completely tested.

Legislation poor, hubris rich

Right now, there aren't any federal laws pertaining to the testing of driverless car tech. That's because regulators fear that a federal law governing the tech could have unintended negative consequences on the development of these safety systems.
Despite the absence of any law barring them from releasing unfinished software, every automaker but Tesla refuses to do so.
A Tesla representative told Mashable that the company puts "over millions of miles" on development vehicles before any vehicle or driving feature is released to the public. Once in the hands of the public, Tesla continues to "closely monitor and refine it over time."
Arguably, this sounds reasonable and thorough. So why is Tesla's Autopilot the only semi-autonomous system currently under investigation by the National Highway Traffic Safety Administration?
I believe the answer is threefold. It's one part branding, one part cost and another part hubris.
Simply put, testing these systems is a huge undertaking. It requires large sums of money, years of development and thousands of man hours. Looking at Tesla's quarterly reports and its stature as a small, upstart automaker, it's easy to assume that it has neither the manpower nor the cash to develop and test semi-autonomous systems as extensively as other carmakers. It's worth pointing out here that Audi started working on self-driving cars a year after Tesla was founded — and four years before Tesla's first car, the Roadster.
Second, Tesla emphasized in an email to Mashable that it does public beta testing to communicate to customers its commitment "to ongoing improvements to the feature over time." Because Tesla wants to appear on the cutting edge, it chooses to implement tech faster than another carmaker might.
This leads me to the last and most worrying part: hubris. This is the word the company actually used to describe its own attitude surrounding the development and production of its third vehicle, the Model X crossover SUV.
The Model X has many one-of-a-kind features, but Tesla soon found itself in over its head with now-problem-riddled falcon-wing doors and parts shortages. As I pointed out not long after the car's debut, Tesla didn't put anything in the Model X that any other automaker couldn't have; those companies have traditionally rejected such features precisely because of the production problems Tesla has encountered. But Tesla ran with them because it's a fledgling company that didn't know any better and presumably wanted to appear futuristic.
This hubristic attitude, in addition to branding and cost, has clearly informed the choice to publicly beta-test Autopilot. Any other automaker could do that, too, but they don't. This past week has made it abundantly clear why.