Victims of sexual assault, rape, harassment, and gender-motivated violence criticized Uber’s arbitration clause

Fourteen victims of sexual assault, rape, harassment, and gender-motivated violence criticized Uber’s arbitration clause, which prevented them from bringing lawsuits about the harm they suffered. Their letter to Uber’s Board of Directors asked that Uber remove (or agree not to enforce) its arbitration clause as to these complaints. They noted a California case in which Uber aggressively sought to force one of their complaints into confidential arbitration. They also noted pending legislation in the United States Congress and New York State Senate that would prohibit companies from requiring victims of sexual harassment or assault to proceed in arbitration.

News coverage from The Mercury News and Recode.

Uber backup drivers fell short in safety functions

CityLab reported widespread shortcomings among the backup drivers responsible for supervising Uber’s self-driving cars. For one, it is unclear whether humans can do a good job supervising machines that work well most of the time; the task requires intense concentration to catch the occasional error, while tempting distractions abound. Uber’s 8-to-10-hour shifts, with a single 30-minute lunch break, were grueling, and drivers were often assigned to repeat the same driving “loops,” which likely made the task particularly dull. Additional challenges included working entirely alone (after Uber removed the second staff person from each vehicle) and, CityLab reported, the vehicles’ frequent hard braking.

Meanwhile, CityLab spoke with multiple drivers who were dismissed from Uber for safety infractions, including using a phone while a vehicle was in motion — undermining any suggestion that all safety drivers do as instructed.

Removed second staff person from autonomous cars

Historically, Uber’s autonomous cars had two staff members onboard: one to take over driving in case of problems, and another to monitor onboard systems, track performance, and label data. But Uber later moved to a single operator. Reviewing 100 pages of internal company documents, the New York Times reported that some employees expressed safety concerns about the change. Among other concerns, they noted that solo work would make it harder to remain alert during monotonous driving.

Broadly, problems seem to have unfolded as internal critics had feared. One Uber autonomous car safety driver was fired after being seen asleep at the wheel. When an Uber vehicle struck and killed a pedestrian in Tempe, Arizona, early review of the onboard video showed the staff person looking down or sideways, perhaps at a phone or onboard systems, but not at the road.

Self-driving cars fell short of expectations

Reviewing 100 pages of internal company documents, the New York Times reported that Uber vehicles were falling short of company objectives. For example, Google cars could drive an average of nearly 5,600 miles before a driver had to take control from the computer, whereas Uber vehicles struggled to meet the company’s target of one intervention every 13 miles.

Self-driving vehicle struck and killed pedestrian

An Uber self-driving vehicle struck and killed a pedestrian in Tempe, Arizona.

Early reports indicated that the pedestrian was crossing a roadway after dark, outside a crosswalk, and that Uber would probably be deemed not at fault in this incident.

But review of the crash video raised multiple concerns. First, Uber’s onboard driver, who was responsible for taking over in case of system problems, was looking down or sideways and hence unable to see the pedestrian. If her hands were on the steering wheel, ready to take over driving from the computer, that is not apparent from the video. Second, the pedestrian was making steady progress across the roadway. Third, some experts said a standard automatic emergency braking system, even on ordinary commercial vehicles, would have been able to detect the pedestrian and at least apply the brakes.

Velodyne, which makes the LIDAR sensors used on Uber’s autonomous cars, expressed surprise that the Uber vehicle hit the pedestrian. A Velodyne spokesperson explained in an email: “We are as baffled as anyone else. … Certainly, our Lidar is capable of clearly imaging Elaine and her bicycle in this situation.” Velodyne suggested that Uber’s software might be at fault, explaining that “[o]ur Lidar doesn’t make the decision to put on the brakes or get out of her way” and that Uber’s systems would need to make those decisions.

Banned in Delhi after driver allegedly raped passenger

Uber was temporarily banned in Delhi, India in December 2014 after a driver allegedly took a passenger to a secluded area and raped her. The decision followed mounting accusations that the company had failed to conduct proper background checks on drivers.

Mike Isaac’s Super Pumped (p. 188) presents the incident in greater detail: The driver noticed that the passenger had fallen asleep, and raped her in the back seat of his vehicle. Afterwards, he threatened to murder her if she told the police.