Uber: Self-driving cars ordered off road by US, sells to Grab

Uber’s self-driving cars have been ordered off the road in Arizona.

State Governor Doug Ducey has withdrawn permission for the company to test autonomous vehicles on public roads in Arizona, one week after the accident that killed 49-year-old pedestrian Elaine Herzberg.

It has been reported in the US that Herzberg was, or had been, homeless.

“Improving public safety has always been the emphasis of Arizona’s approach to autonomous vehicle testing, and my expectation is that public safety is also the top priority for all who operate this technology in the state of Arizona,” Ducey wrote in a letter to Uber CEO, Dara Khosrowshahi.

“The incident that took place on March 18 is an unquestionable failure to comply with this expectation.”

While there is no news of similar moves by other states, any preliminary finding that Uber’s technology was at fault would inevitably lead to a nationwide suspension.

Change of heart?

After the incident, Tempe police chief Sylvia Moir said the accident may have been unavoidable. However, this latest move by the Arizona authorities may suggest that they now believe Uber’s technology was at fault.

Tempe police released a video of the incident last week, which shows that the car failed to slow down or take evasive action to avoid hitting Herzberg, who was crossing the road with a bicycle.

Some analysts now believe that, while the car’s onboard cameras might have been impaired by the low light conditions, its LiDAR (Light Detection and Ranging) system should have had time to register Herzberg’s presence. The fact that this didn’t happen may suggest that the laser-based system was switched off, malfunctioning, or ineffective.

The Tesla angle

Writing on investors’ forum seekingalpha.com, analyst EnerTuition suggested that, alongside Uber, investors should be wary of Tesla’s technology for a simple reason: Tesla’s autonomous vehicles lack a LiDAR system.

“A pedestrian walking the bike proceeds through a high-speed roadway in darkness who goes unnoticed by Uber’s Vision system, arguably, until it is too late. This is one of the fundamental challenges of a Vision centric system, like the one used by Tesla.

“The value of the Vision system deteriorates under low light scenarios. LiDAR, on the other hand, is a complementary system that would work well in low light situations. There is a reason that almost all autonomous companies, with the notable exception of Tesla, use LiDAR as part of sensor suites.”
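The analyst’s point about redundancy can be illustrated with a toy example. The Python sketch below is a hypothetical illustration only (none of the names reflect Uber’s or Tesla’s actual software): it shows an OR-style fusion rule in which a credible obstacle reported by either the camera or the LiDAR triggers braking, so a scene dark enough to degrade the camera can still be caught by the laser-based sensor.

```python
# Hypothetical illustration: a toy fusion rule showing why a redundant
# LiDAR channel matters when camera confidence collapses in low light.
from dataclasses import dataclass

@dataclass
class Detection:
    source: str        # "camera" or "lidar"
    confidence: float  # 0.0 - 1.0
    distance_m: float  # range to the detected object

def should_brake(detections: list[Detection],
                 threshold: float = 0.5,
                 stop_range_m: float = 40.0) -> bool:
    """Brake if ANY sensor reports a credible obstacle inside stopping range.

    An OR-style fusion rule means a dark scene that blinds the camera can
    still be caught by LiDAR, which supplies its own laser illumination.
    """
    return any(
        d.confidence >= threshold and d.distance_m <= stop_range_m
        for d in detections
    )

# Night-time scenario: the camera barely sees the pedestrian, the LiDAR does.
night_frame = [
    Detection("camera", confidence=0.15, distance_m=35.0),  # degraded by darkness
    Detection("lidar",  confidence=0.90, distance_m=35.0),  # unaffected by darkness
]

print(should_brake(night_frame))       # True: LiDAR alone triggers braking
print(should_brake([night_frame[0]]))  # False: a camera-only system misses the hazard
```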

The video also reveals that Uber’s safety driver was looking down, and not at the road, and so didn’t see Herzberg until immediately before impact. The absent, distracted, or delayed responses of human drivers in autonomous or semi-autonomous vehicles may prove to be a critical problem in these tests.

“In our view, it is foolish to expect that people can maintain a required level of alertness when autonomy seems to be working. Even partially working autonomy lulls people into a false sense of security and increases the response time.

“Google has long argued that man-machine hand-off is a problem and has opted to move directly to implementing a Level-4 [autonomy] system,” said EnerTuition.

Legal quagmire

Tempe police chief Moir said that she did not rule out taking action against the safety driver. However, this could create a legal quagmire, given that the car is known to have been operating autonomously at the time.

If humans onboard autonomous or semi-autonomous vehicles are required to be 100 percent focused on the road, then this would challenge the very concept of autonomy.

The incident in Tempe, Arizona, was the first fatal accident involving a driverless vehicle and a pedestrian. Uber had been testing its self-driving Volvos on open roads in the state, along with three other locations, including San Francisco.

Uber voluntarily suspended its tests after the accident, as did Toyota. However, other companies’ self-driving experiments have continued unchanged.

Plus: Grab takes Uber’s South East Asia business

In related news, Uber has sold its ride-share and delivery business in South East Asia to local rival Grab, the region’s most popular service, with millions of local customers.

The terms of the deal have not been disclosed. However, the move does not signal a total retreat from Asia, following the 2016 sale of Uber’s China business to Didi Chuxing. Instead, Uber will take a 27.5 percent stake in Grab, and CEO Khosrowshahi will join the company’s board.

In an email to Uber staff, Khosrowshahi said:

“One of the potential dangers of our global strategy is that we take on too many battles across too many fronts and with too many competitors. This transaction now puts us in a position to compete with real focus and weight in the core markets where we operate, while giving us valuable and growing equity stakes in a number of big and important markets where we don’t.

“While M&A will always be an important value-creation tool for our company, going forward we will be focused on organic growth – growth that comes from building the best products, services, and technology in the world, and re-building our brand into the mobility brand that riders, cities, and drivers want to support and partner with.

“Onward.”

• No apparent mention has been made on Uber’s website of the incident in Arizona, or of the latest move by state authorities to prevent further testing.

Internet of Business says

While it’s important not to be alarmist about one tragic death in Arizona – given the 1.2 million people who die on roads worldwide every year – it’s also important to face the fact that an AI system appears to have killed a human being, or at least not attempted to avoid doing so.

And as our earlier report explained, there are very few fully autonomous vehicles on the road compared with the 1.2 billion conventional cars in use today, so this accident should be examined with great care.

Whatever Arizona chooses to do next, one thing should be apparent: the legal situation is highly problematic, but the law will demand that someone takes responsibility or is deemed liable.

There are two key questions. One: Can a safety driver be held responsible if an autonomous system is operating, but fails to detect a pedestrian?

If so, this would appear to suggest that the person onboard is deemed to have superior judgement and faculties. If that is the case, then autonomous vehicles cannot be regarded as safer at present. It may also deter people from wanting to be safety drivers, if they believe they may become legal scapegoats.

And two: if the technology itself is at fault, then who is responsible? Autonomous systems are a complex mix of sensors, cameras, software, positioning systems, and more. If one component fails or is inadequate, then establishing legal liability will be far from simple.

Tempe may prove to be a watershed moment, not just in the law regarding autonomous vehicles, but also in establishing legal precedent for what happens if or when AI takes a human life.

Read more: Toyota halts autonomous car tests after Uber accident

Read more: New Baidu, Jaguar Land Rover driverless cars take to the road

Read more: AI regulation & ethics: How to build more human-focused AI