Uber = aggressive: "Uber’s culture is a bad fit for building a driverless car"

春风吹:
Ignoring speed limits before it was fully sure of its technology, hiring people with criminal records as backup drivers, and insisting on in-house development despite lacking the R&D strength of the other big companies... these are all hallmarks of Uber. Fierce, reckless, wild.

Video suggests huge problems with Uber’s driverless car program

Video of the fatal driverless car crash in Tempe is damning for Uber.
Timothy B. Lee - 3/22/2018, 4:36 PM

[Image: Uber's cracked windshield after the crash. Credit: Aurich Lawson]
There's something very wrong with Uber's driverless car program.

On Wednesday night, police released footage of Sunday night's deadly car crash in Tempe, Arizona, where an Uber self-driving car struck 49-year-old Elaine Herzberg. The details it reveals are damning for Uber.

"The idea that she 'just stepped out' or 'came out in a flash' into the car path is clearly false," said Tara Goddard, an urban planning professor at Texas A&M University, after seeing the video. "It seems like the system should have responded."

The video shows that Herzberg crossed several lanes of traffic before reaching the lane where the Uber car was driving. You can debate whether a human driver should have been able to stop in time. But what's clear is that the vehicle's lidar and radar sensors—which don't depend on ambient light and had an unobstructed view—should have spotted her in time to stop.

On top of that, the video shows that Uber's "safety driver" was looking down at her lap for nearly five seconds just before the crash. This suggests that Uber was not doing a good job of supervising its safety drivers to make sure they actually do their jobs. The combination of these failures—and Herzberg's decision to jaywalk in the first place—led to her death.

But zooming out from the specifics of Herzberg's crash, the more fundamental point is this: conventional car crashes killed 37,461 people in the United States in 2016, which works out to 1.18 deaths per 100 million miles driven. Uber announced that it had driven 2 million miles by December 2017 and is probably up to around 3 million miles today. If you do the math, that means that Uber's cars have killed people at roughly 25 times the rate of a typical human-driven car in the United States.
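For readers who want to check that arithmetic, here is a minimal sketch using only the figures quoted above (the 3-million-mile total is the article's own estimate):

    # Back-of-the-envelope check of the fatality-rate comparison.
    # All inputs are the article's figures.

    human_rate = 1.18 / 100e6   # US baseline: 1.18 deaths per 100 million miles
    uber_rate = 1 / 3e6         # 1 death in roughly 3 million autonomous miles

    print(f"Uber's rate is ~{uber_rate / human_rate:.0f}x the human baseline")
    # -> ~28x, in line with the article's "roughly 25 times"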

Of course, with a sample size of one, this doesn't tell us very much, statistically speaking, about the safety of Uber's vehicles in the future. It's possible that Uber just got exceptionally unlucky. But it seems more likely that, even with the safety driver, Uber's self-driving cars are way more dangerous than a car driven by the average human driver.

This shouldn't surprise us. Uber executives know they're behind Waymo in developing a self-driving car, and they've been pulling out all the stops to catch up. Uber inherited a culture of rule-breaking and corner-cutting from its founder and former CEO Travis Kalanick. That combination made a tragedy like this almost inevitable.

Uber probably wasn’t at fault, legally speaking, in recent crashes

[Image: An Uber self-driving car in San Francisco in 2017. Credit: Justin Sullivan/Getty Images]
Consider these recent crashes involving self-driving Uber cars:

In March 2017, an Uber self-driving car was struck on the left side as it went through an intersection in Tempe, Arizona. Uber was in the right-most lane on a six-lane road, approaching an intersection. The other two lanes in Uber's direction were backed up with traffic. The other car was traveling in the opposite direction and making a left turn. The driver of that other vehicle said that cars stopped in the other lanes blocked her view, preventing her from seeing the Uber vehicle.

"Right as I got to the middle lane about to cross the third, I saw a car flying through the intersection, but I couldn't brake fast enough to completely avoid the collision," the driver of the non-Uber car said in the police report. Police cited the non-Uber driver for failing to yield the right of way. The Uber driver was not cited.

In February 2018, an Uber vehicle in Pittsburgh collided with another vehicle after the other car made a left turn in front of it. The Uber vehicle had its turn signal on, and the other driver thought this meant the Uber vehicle was going to turn at the intersection rather than go straight through. Uber says the car had its turn signal on because it was planning to change lanes.

Police did not determine who was at fault in the accident. But a Pennsylvania attorney told Ars that "generally speaking, before you take a left-hand turn, you're required to ensure there's not traffic coming from the other direction."

In March 2018, we had this Sunday's deadly crash in Tempe. Authorities have not reached any final conclusions about the case, but experts have told Ars there's good reason to believe Herzberg may have been at fault, legally speaking. She was jaywalking in the middle of the night outside of a marked crosswalk.

"I think that preliminary results coming out is that the automation of the car was not at fault because the pedestrian stepped into the road," said Mohamed Abdel-Aty, a civil engineer and traffic safety expert at the University of Central Florida.

So in all three of these incidents, there's a strong argument that the other people involved—not the Uber car—were legally at fault for the crashes.

That doesn’t mean Uber’s cars are driving well

[Image: Jessica McLemore's photo of the damage to her car shortly after a crash with an Uber vehicle in Pittsburgh in February 2018. Credit: Jessica McLemore]
"One of my big concerns about this incident is that people are going to conflate an on-the-spot binary assignment of fault with a broader evaluation of the performance of the automated driving system, the safety driver, and Uber's testing program generally," said Bryant Walker Smith, a law professor at the University of South Carolina.

"Human drivers recognize that they are going to deal with all kinds of behaviors that are not exactly lawful," he added. "An obligation imposed under most if not all state vehicle codes are that drivers shall take due care to avoid a collision. You never get to say, well of course I hit them, they were in the road in my way."

Indeed, it's entirely possible to imagine a self-driving car system that always follows the letter of the law—and hence never does anything that would lead to a legal finding of fault—but is nevertheless far more dangerous than the average human driver. Such a system might behave a lot like Uber's cars do today.

For example, in that March 2017 collision in Tempe, the Uber driver reported that he was traveling 38 miles per hour (61 km/h) at the time of the crash—just shy of the 40-mile-per-hour speed limit.

"As I entered the intersection, I saw the vehicle turning left," he wrote. "There was no time to react as there was a blind spot created by the line of southbound traffic."

The Uber car may have had a legal right to zip past two lanes of stopped cars at 38 miles per hour (61 km/h). But a prudent driver could have anticipated the possibility of a car in the blind spot—or, for that matter, a pedestrian trying to dart between the stopped cars in the next lane—and slowed down to 30 or even 20 miles per hour.
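A rough stopping-distance sketch makes that tradeoff concrete. The reaction time and braking deceleration below are illustrative assumptions, not figures from the article or from any crash investigation:

    # Rough stopping-distance comparison for the speeds discussed above.
    # Assumed values (not from the article): 1.5 s reaction, 7 m/s^2 braking.

    REACTION_S = 1.5        # seconds before braking begins
    DECEL_MS2 = 7.0         # hard-braking deceleration, m/s^2
    MPH_TO_MS = 0.44704     # miles per hour to meters per second

    def stopping_distance_m(speed_mph):
        """Reaction distance plus braking distance, in meters."""
        v = speed_mph * MPH_TO_MS
        return v * REACTION_S + v ** 2 / (2 * DECEL_MS2)

    for mph in (38, 30, 20):
        print(f"{mph} mph -> about {stopping_distance_m(mph):.0f} m to stop")
    # 38 mph -> about 46 m; 30 mph -> about 33 m; 20 mph -> about 19 m

Under these assumptions, slowing from 38 to 20 mph cuts the total stopping distance by more than half, which is exactly the margin a prudent driver buys when passing a blind spot.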

So, too, in Pittsburgh. There was no police report, so we don't know how fast the Uber car was traveling or if it tried to stop. But a prudent driver approaching an intersection where an oncoming car has a left turn signal on will slow down a bit and be prepared to stop—just in case the other car decides to turn illegally.

As for this month's crash in Tempe, there seems to be little doubt that there was a serious failure of the car's technology. It may or may not have been feasible for the car to stop based on camera data alone. But lidar works just as well at night as it does in the daytime. Even if Uber's software didn't recognize Herzberg and her bicycle as a person, a car should always slow down when it sees an object that big moving into its lane.

Moreover, if the car really couldn't have stopped in time to avoid killing Herzberg, that seems like a sign that the car was driving too quickly. It's not like she jumped out from behind some bushes.


[Image: Anthony Levandowski, then VP of engineering at Uber, speaking to reporters at the Uber Advanced Technologies Center on September 13, 2016 in Pittsburgh, Pennsylvania. Credit: Angelo Merendino/AFP/Getty Images]
The fundamental issue here is that automotive safety is all about redundancy. Good drivers anticipate the possibility that the other driver—as well as pedestrians and other road users—might make a mistake and take appropriate precautions. That way, even if one driver makes a mistake—which drivers inevitably do—a crash can be prevented.

Good highway engineers follow the same philosophy. They install guard rails to save drivers who go off the freeway. They add speed bumps to roads where speeding endangers pedestrians. They redesign intersections where crashes persistently occur. From the engineer's perspective, it doesn't matter who was at fault in any particular crash—if a road design leads to people getting hurt regularly, it's a bad design.

The same principle applies to the design and testing of driverless cars. Driverless cars use multiple categories of sensors—cameras and lidar and radar—so that the failure of any one type of sensor can't create a dangerous situation. Waymo and GM's Cruise, Uber's leading competitors in the driverless car market, both say that their cars have redundant computers as well as redundant braking, steering, and power systems.
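To illustrate the principle (this is a toy sketch, not Uber's or anyone's actual architecture), a fail-safe fusion rule treats the sensor channels as independent votes and brakes if any healthy channel reports an obstacle, so one blinded or broken sensor type cannot by itself suppress a braking decision:

    # Toy illustration of fail-safe sensor redundancy. Real systems use far
    # more sophisticated fusion; the point is only that the channels are
    # independent, so one failure cannot silence the others.

    def obstacle_detected(channels):
        """Return True if ANY healthy channel reports an obstacle.
        Each channel is a zero-argument callable returning True/False,
        or raising an exception if the sensor itself has failed."""
        for read in channels:
            try:
                if read():
                    return True
            except Exception:
                continue   # a dead sensor is ignored, never trusted
        return False

    # Hypothetical channels, for illustration only:
    camera = lambda: False   # defeated by darkness, sees nothing
    lidar = lambda: True     # unaffected by ambient light
    radar = lambda: True

    print(obstacle_detected([camera, lidar, radar]))   # True -> brake

Erring toward braking raises the false-alarm rate, but in safety engineering that is usually the right side to err on.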

And, of course, safety drivers are the ultimate form of redundancy in driverless cars. If they do their jobs well, safety drivers should be able to ensure that self-driving cars are at least as safe as a human driver during testing—since the driver will take over if the car makes a mistake.

By contrast, court records unearthed by the recent Uber/Waymo lawsuit showed that a key architect of Uber’s driverless car technology was contemptuous of this approach to driverless car safety.

"We don't need redundant brakes & steering or a fancy new car; we need better software," wrote engineer Anthony Levandowski to Alphabet CEO Larry Page in January 2016. "To get to that better software faster we should deploy the first 1000 cars asap. I don't understand why we are not doing that. Part of our team seems to be afraid to ship." In another email, he wrote that "the team is not moving fast enough due to a combination of risk aversion and lack of urgency."

Of course, when you're building a technology that could get people killed, "risk aversion" isn't necessarily a bad thing.

Uber’s culture is a bad fit for building a driverless car

[Image: Travis Kalanick speaks at an event in New Delhi on December 16, 2016. Credit: Money Sharma/AFP/Getty Images]
Shortly afterward, Levandowski left Google to create a startup, Otto, that was quickly acquired by Uber and became the core of Uber's driverless car project. As The Verge's Sarah Jeong points out, then-Uber CEO Travis Kalanick shared Levandowski's win-at-any-costs philosophy.

"I just see this as a race and we need to win, second place is first looser [sic]," Levandowski wrote in a text message to Kalanick.

Kalanick was making similar arguments at the time.

"If we are not tied for first, then the person who is in first, or the entity that's in first, then rolls out a ride-sharing network that is far cheaper or far higher-quality than Uber's, then Uber is no longer a thing," Kalanick told Business Insider in 2016.

Later that year, an Uber vehicle was seen barreling through a red light in San Francisco. Uber initially claimed that the vehicle was not in driverless mode at the time, but a subsequent report from The New York Times indicated that this was untrue.

"All told, the mapping programs used by Uber's cars failed to recognize six traffic lights in the San Francisco area," the Times found, citing internal Uber documents.

Shortly afterward, Uber left the state of California in the midst of rising tensions with California regulators. The company moved its testing operations to Arizona, which is known for having one of the country's most permissive testing regimes.

Uber has since returned to California roads, but a large share of its testing activities now take place in the Phoenix area. One consequence of that shift is that the public doesn't know very much about how the vehicles are performing.

California law requires driverless car makers to report all collisions to state authorities and to file an annual report detailing the number of "disengagements," when a human driver needs to take over the vehicle. But Arizona has no such requirement, making it difficult for state officials or the general public to know how well Uber's cars are performing in the state.

Levandowski and Kalanick both left Uber last year. The company is now led by a new CEO, Dara Khosrowshahi, and driverless car executives who, one hopes, don't share Levandowski and Kalanick's win-at-any-cost philosophy.

But company cultures tend to be sticky. Over time, teams develop shared understandings about what values are important and what behaviors are unacceptable. A founder sets the tone for the entire company, and that influence can persist long after the founder himself has departed.

And nothing Uber has done with its driverless car program since Kalanick's departure last summer indicates that the company has changed its culture in a serious way. Industry insiders widely see Uber as a technology laggard, yet it recently placed an order for 24,000 Volvo cars that will be modified for driverless operation. Delivery is scheduled to start next year.

This rollout plan is more ambitious than anything industry leader Waymo has announced, and more ambitious than anything any of Uber's other competitors have announced, with the possible exception of GM's Cruise. It puts a lot of financial pressure on Uber—and on Uber's driverless car engineers—to have their software and sensors ready in time. It will be a financial disaster for Uber if the company has to take delivery of thousands of vehicles that it can't use because its software isn't ready.
 
Uber's bosses are rotten! They already exploit their drivers, and now they're using self-driving cars to run them over.
 
As the saying goes, riches are won by braving danger. I have nothing but scorn for the view in the original post.
 