> Waymo, Cummings says, “should not be allowed to operate around schools during school pickup and drop-off until they get this problem fixed and can demonstrate it with specific tests.”
This commenter must misunderstand the situation. School buses regularly stop to pick up and drop off students on streets near where they live, and there are generally schools all around. If Waymos can't properly respond to school bus signals, they need to not operate in areas where these pickups and drop-offs happen, which is not exclusively near schools.
If a cop notices this, who gets the ticket? Asking because I’ve noticed Waymos starting to go above the speed limit now. They’re generally just matching the flow of traffic like everyone else, but it does raise the question: who gets fined? And if the fleet as a whole racks up more than 4 points in 12 months, would Waymo lose its license the way a human driver would?
I saw a Waymo pull into a nonexistent rightmost lane at a stoplight. I thought it was going to turn, but it instead proceeded forward and forced the driver in the actual rightmost lane to brake to let it merge; otherwise it would have caused an accident, as there was no lane in front of it.
This was on El Camino in Santa Clara. I was highly surprised, as I was under the assumption they were pretty much production-ready, since they have been expanding their service area a lot.
Use statistical incidence rates, not "I saw a thing...", to make that call. I mean, I'm sure most drivers regularly think "wow, maybe humans shouldn't be allowed to drive" every time they go out on the road.
The thing about human drivers is we’re all unique little stupid snowflakes.
If a software-powered car is vulnerable to a certain condition, presumably all cars running that software system are. It's the rare case where we actually can generalize from a single bad-driving story.
I don't think this checks out. Would the model do the same thing when presented with the exact same inputs? Yes. Is it more likely to do the same thing at the same intersection? Probably. But if you repeat a similar setup somewhere else, it might not. The bad behavior still exists and should be fixed, but it doesn't mean they're bad drivers in general.
People have trouble seeing outside of their own biases and understanding how different another view can be with a different background and context to the situation. I have no problem confidently saying the parent poster has definitely made worse and more questionable driving decisions under more constrained and more dangerous situations on the road, and then never thinks twice about it after that moment because it had no consequences. All they need to do is look at driver safety statistics of autonomous vehicles vs humans to immediately reject their flawed understanding, and they never will.
Luckily, cars and driving in general aren't enshrined as an early amendment of the constitution (in the US) and aren't even considered a legal right, so pushback to change won't be artificially inflated for several decades by heavily motivated interest groups seeking to spread misinformation about their safety. Not with a bang, but a whimper.
You're missing that the difference is incentives, specifically perverse incentives being scaled up. If we were talking about an individual hacker who programmed their car for automated driving and it made the above wrong decision, people would straightforwardly attribute fault to the individual. The problem here is large corpos, who will eagerly tout their prerogative to do whatever they want as long as it's within the law, and who go beyond even that, breaking the law with impunity.
We can easily imagine a crash from such a thing being declared "no fault" (or even the fault of the turning driver!) based on corpo-sympathetic police, judiciary, and regulators who have succumbed to the inevitable "the computer can't be wrong". That perceived lack of justice is the problem: when another individual does something wrong (whether accidental or willful) and gets away with it, we can brush it off, because their bad behavior will eventually catch up with them. Whereas with corpos, it has been thoroughly demonstrated that this will not happen.
That page addresses tort liability, not liability for driving infractions or crimes. Liability for damages when a company causes them is a more settled question.
It still isn't quite as clear who or if anyone is liable when traffic laws are broken:
Sounds like the tickets should be at least more expensive than the cost of equivalent QA (and if not, self driving companies might offload QA to the police).
I remember when they told us that autonomous cars wouldn’t break laws and wouldn’t speed.
I always felt this was just a strategy, and that soon enough fleet operators would turn up the dials on speed and aggressiveness. After all, the only people who can complain are the people outside the car, and they will be dead.
There are highways in the US where drivers regularly go 10-20 over the speed limit, if not more. Maintaining the speed limit on a road that's labeled as a 45 MPH zone but is treated as a 65 will be dangerous for everyone involved: both the cars approaching the slowpoke at 20+ miles an hour, and the slowpoke itself.
I don't know how Waymo is going to square that circle.
That's Phoenix; it's already here. Waymos nominally commit to keeping to the speed limit, and it is _extremely noticeable_ that they do, because literally NO ONE drives 65 on the freeways here. Everyone is going at least 74; it's a rite of passage in Arizona. It's not even a speeding ticket until 75. That goes back to the '70s, when the feds tried to force speed limit laws by threatening to revoke highway funding. Arizona said "fine, but it's not a speeding ticket. it's 'misuse of a finite resource.'"
So you'll see the Waymos kind of puttering along at 65 as everyone zooms around them. They DO say they'll occasionally exceed speeds when it's safer to do so, but it's obvious they don't want a narrative of them being speed demons and flying around exceeding the speed limit.
I used to live in a place where this was common -- the issue was not just speed, but a general disregard for traffic law because traffic law was unenforced. You could be going 50 in a 35 and someone would aggressively pass you. At some point, the road is simply occupied by unsafe drivers and there's not much you can do other than hold your line and be as predictable as possible to the aggressive drivers around you.
I understand this phenomenon and experienced it when I used to drive. What I found so revealing was it ultimately meant that the people weren’t actually driving their cars.
Each ostensibly independent driver was being forced to drive a certain way by the most aggressive driver behind them, and in turn they were required to force the driver ahead of them to drive in the same way.
> a road that's labeled as 45MPH zone, but is treated as a 65
If this is the case, then the speed limit is too low. To control speed on such a road you either need draconian enforcement or you need to change the road so people aren't comfortable driving that fast. Make the lanes narrower, introduce lane shifts or reduce the number of lanes, etc.
A large problem in speed limit setting is that the 85th percentile is often used to set the limit, while other factors are ignored or not weighted as heavily.
It's a very fuzzy practice, and I think as we continue towards an automated driving world, we need to be more critical of how speed limits are set.
Using the 85th percentile as a means to determine speed limits ends up with 15% of all drivers exceeding the speed limit, or worse, more drivers exceed the speed limit than those original 15% because they know consequences may be rare.
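The 85th-percentile method described above is easy to sketch. Here is a minimal, hypothetical illustration; the speed sample and the round-to-nearest-5-mph posting convention are assumptions for the example, not data from the thread:

```python
import statistics

# Hypothetical free-flow speed sample (mph) from a road study; real
# studies use hundreds of measurements under uncongested conditions.
observed_speeds = [48, 52, 55, 57, 58, 58, 60, 61, 62, 63,
                   64, 66, 68, 70, 72, 75, 78, 80, 82, 85]

# 85th-percentile method: find the speed at or below which 85% of
# drivers travel, then round to a 5 mph increment for posting.
p85 = statistics.quantiles(observed_speeds, n=100)[84]
posted = round(p85 / 5) * 5

print(p85, posted)  # → 79.7 80
```

With this toy sample the method would post 80 mph, which illustrates the point above: the limit chases observed behavior, and by construction about 15% of drivers are still above it.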
Sometimes bad road design (e.g. lanes that are too wide) is to blame, but in miserable neighborhoods with no traffic enforcement at rush hour, you can also end up in a situation where the majority of people on the road are simply aggressive drivers who are familiar with it. At some point you do need to enforce the law if it isn't being respected. There is a growing subset of people in the US who not only disregard traffic law but pride themselves on their disdain for it.
IDK if it's draconian, but speed cameras, or simply requiring cars to have modules that report speeds at certain points and issue fines automatically, should be standard by now. What's the point of having smarter cars if they can't be forced to stay below the legal speed limit?
I don't think building enforcement into cars would be a good idea, or even effective, but a few speed cameras work wonders for changing the overall 'temperature' of driving in an area.
There's a road near me that just dropped the speed limit to 40. This is a divided road, two 12-foot lanes in each direction, good visibility, with turning lanes at intersections. It's highway-class. Most people drive 55 or 60, because that speed feels appropriate and reasonably safe (search the "85th percentile" rule in setting speed limits to read more about this).
By reducing the speed limit to 40 the road is IMO less safe, because there are always some people who very conscientiously do not exceed the posted speed limit. So now you have some people driving 40, while most people still want to go 55 or 60. This creates an unsafe mix of vehicle speeds.
>After all, the only people who can complain are the people outside the car, and they will be dead.
I'm not sure how you can earnestly make this claim while reading people complaining about the speed and aggressiveness. Do you suspect you're replying to ghosts?
Tesla specifically programmed their self driving mode to roll through stop signs without stopping. I don't think anyone has believed the claims of the self driving marketers for a long time now.
People are getting wise that they can abuse these cars on the road: cut them off, not let them in. Waymo needs to respond like other drivers in the city; if they want to merge lanes, force their way in and demand that space be created.
> Asking because I’ve noticed Waymos starting to go above the speed limit now
Where at? I'm curious because I see a lot of people say this, but I've never seen them go more than 1 mph over the limit when riding in them, and I watch them do 65 on the freeway every day, even when people are passing.
And school buses go all sorts of places, carrying kids to field trips and sporting events. Along with police cars, fire trucks, and ambulances, school buses are just another special type of vehicle that ALL drivers must learn to deal with. If you cannot act properly around a school bus, you shouldn't be on the road.
(Funny story: I was in Ottawa over the winter. There, snow plows, ambulances, and fire trucks all use blue flashing lights. I thought I was being pulled over by a giant police truck ... it was a snow plow that really did not appreciate me stopping on the side of the road. Yet another special-case vehicle.)
So... it sounds like they're doing a lot better to me? 19 cases in the fall, 4 between the recall in Novemberish and January, and 1 between then and now, which occurred in January?
Also, lol at this quote in the article: "Six vehicles passed the school bus while it was stopped, the agency said. It is still investigating." What it doesn't note is that the other 5 seem to have been human-driven passenger vehicles. From the NTSB report: "located in Novi, Michigan, replied “No” to the prompt. The ADS-equipped vehicle then resumed travel and passed the school bus while its stop arms were still extended. A passenger vehicle following the ADS-equipped vehicle similarly passed the school bus. In total, six vehicles passed the school bus while it was stopped. A crash did not occur." So it sounds to me like 4 people passed it, the Waymo was like "wtf, I'm pretty sure that's a stopped bus," a human incorrectly identified it as not a bus, the Waymo passed it, and then one more person passed after the Waymo.
> A preliminary report by the NTSB published in early March found that one ensuing incident, on January 12, occurred after a Waymo remote assistant, a Michigan-based human tasked with “helping” the software when it was struggling on the road, incorrectly told the robotaxi that the school bus ahead of it didn’t have active signals on. Six vehicles passed the school bus while it was stopped, the agency said. It is still investigating.
I will let you judge for yourself here what the "right" thing for the Waymo to do was... but let's think critically about how Waymos work in the real world, benchmarked against other real drivers dealing with real life issues.
Obviously it's unacceptable to flout the law. I do wonder what the risk profile is. Obviously kids can be erratic and unexpected, and a kid racing out from behind a flat-nosed pusher bus wouldn't be totally unheard of. But I also do low-key wonder if the Waymo's response time and speed might be enough that there's not much real risk. The law and expectations ought to be followed! But I am low-key curious, too, whether the Waymo's infinite attention and seeming caution would mitigate the risk adequately.
The fact that it is passing stopped school buses does rather suggest that perhaps as cautious as it is, it still isn't smart enough to be cautious in the right ways.
The unfortunate situation is that self-driving vehicles need to be fantastically superior to human drivers to secure the public trust.
Self-driving vehicles that are much better than human drivers aren't enough.
It's similar to making alternative software targeting an entrenched incumbent. The disruptor needs to add value that overcomes the friction of switching at a minimum and then more to make it worthwhile.
So I assume Waymo will be immediately banned from any residential areas until they can demonstrate the ability to follow the laws of the road?
The problem is there is zero enforcement. We know the vehicle is not safe around schoolchildren so the appropriate incentive needs to be applied to get the issue addressed.
> when a Waymo vehicle is driving itself, Waymo may be legally considered the operator, even if a human passenger sits inside
Source: https://www.vazirilaw.com/faqs/whos-liable-in-a-waymo-self-d...
https://web.archive.org/web/20251025055924/https://www.nytim...
Often, they are simply getting away with it.
https://www.ite.org/technical-resources/topics/speed-managem...
> If this is the case, then the speed limit is too low.
I don't disagree with you, but it's still a problem if there are drivers on that road who are driving so slowly as to be unsafe, robot or human.
> turn up the dials on speed and aggressiveness
You literally cannot drive on public roads unless you match the speed, flow, and maneuvering of other traffic.
"A School District Tried to Help Train Waymos to Stop for School Buses. It Didn’t Work."