Cars block the street all the time. There is ample room to pass the Waymo on the left in the opposing lane, yet those SUV-driving humans don't care to move out of the way either, and the police just block the maneuver area too.
That silver car in the front could also just pull forward and make space. Situational awareness has room for improvement for a lot of entities in this short video.
Nueces Street is three and a half lanes wide there, plus massive sidewalks; apparently too narrow for even more massive ambulances.
You can arrest a driver for not making space for an emergency vehicle. Who do we arrest here?
> You can arrest a driver for not making space for an emergency vehicle. Who do we arrest here?
That's the best part: no one! We have finally managed to invent a system that disperses accountability so widely that no one can be held liable when something goes wrong.
We've had this ever since Corporations were invented.
>no one can be held liable when something goes wrong.
No, at the very least tort laws still apply even if the driver is a corporation. Do you really need someone sitting in jail to satisfy your justice boner?
Yes, I want to see real, serious punishment for corporate crimes, on par with the life disruption experienced by people who see a jail sentence. It's almost always brutal - major income disruption, job loss, etc. If it's a small fine, which it always seems to be for corporations, then there is no incentive for following the law. I'm also in favor of corporate death sentences for large-scale egregious violations - liquidate assets and jail executives.
By corporatizing social harms, basically nobody is ever held accountable - except for the little guy.
>By corporatizing social harms, basically nobody is ever held accountable - except for the little guy.
Again, this is false. At the very least there are financial penalties, which the shareholders are on the hook for. Moreover, the corporate malfeasance that does happen doesn't map nicely onto human crimes. If you kill a guy, you get sent to jail for decades. But what if you're a company that makes a machine with sloppy code[1] that unintentionally kills someone? What do you do? Jail the programmer who wrote the code? Jail the manager who did the code review? Jail the CEO who had no knowledge of it, but "the buck stops with him" and we hate CEOs? How would the death penalty work? If you think it through, it's basically a fine equivalent to the company's market cap. If Boeing does a bad that kills one person, does that mean the US government just repossesses the entire company?
After watching the movie "Dark Waters" about the whole Teflon scandal, it seems like it should be the highest-up person (or people) who had knowledge of the incident (which obviously must be proven). An individual engineer knowing a car has a dangerous edge case isn't enough to get them in trouble in my view, especially if the company has claimed they are working on fixing it. Also, legitimate mistakes are just mistakes; companies won't get it right every single time.
However, there are cases where it's completely proven that someone high up knew there was a systemic safety issue (they had a broad view and could see all the different areas of what was going on), knew exactly what was causing it, and did nothing because they wanted to keep the profit going. The fact that those people don't go to jail just tells me that corporations have way too much leeway.
Depending on how severe the error is, it could be professional negligence. In other professions, including engineering, this can result in a loss of the professional's license and their inability to continue to work in that field. Also, for negligent drivers, a suspension of their driving license can apply. So there is precedent for severe punishment even if nobody gets a jail sentence.
I think of the corporate death penalty as being more appropriate when leadership knew exactly what was going on and chose profits over people. Exxon, see https://www.science.org/doi/10.1126/science.abk0063. Purdue Pharma, see https://en.wikipedia.org/wiki/Purdue_Pharma. The company gets sold for parts and the leadership goes to prison, probably for life, given the number of lives they potentially destroyed. Pretty much all the tobacco companies knew how harmful their product was and made a concerted effort to fund their own bogus studies to throw up a smoke screen. Facebook makes billions from (for example) scams and fraudulent ads: https://www.reuters.com/investigations/meta-is-earning-fortu.... Maybe don't throw their CEO in prison, but at least fine them 10x the profit they made vs. the usual 0.0001%.
In Australia it's the board of directors who are liable. They can be liable if they personally direct the company to do something illegal (obviously?) but there is also a positive obligation to exercise due diligence. This covers (but is not limited to) workplace safety and safety of customers and the public. Directors can be personally liable for breaches of this duty and the penalties extend to possible imprisonment and very substantial fines.
For example: https://www.owhsp.qld.gov.au/court-report/fines-imposed-fail...
>but there is also a positive obligation to exercise due diligence. This covers (but is not limited to) workplace safety and safety of customers and the public.
Is there any indication this requirement was breached in this case? I'm all for jailing executives of companies that specifically failed to enact safety measures, or even didn't care enough about safety, but here it's simply an edge case they didn't test. It's not for lack of trying either. Apparently they have their own AI model to generate test data, so they can train/test what happens if a hurricane hits, for instance.
https://waymo.com/blog/2026/02/the-waymo-world-model-a-new-f...
In this case it just sounds like the thought process was
> waymo did a bad
> someone doing the same would be arrested (?)
> therefore somebody needs to be arrested
> in this case it's simply an edge case they didn't test. It's not for lack of trying either.
Agreed. And because responsible driving is almost all edge cases, they shouldn't be held liable for any of them as long as they tried.
- [deleted]
> At the very least there's financial penalties, which the shareholders are on the hook for.
If i poison someone, i go to jail. If DuPont poisons thousands, "there's financial penalties".
> Do you really need someone sitting in jail to satisfy your justice boner?
Literally, and while intentionally avoiding any attempt to examine the implications? No, probably not.
But reasonable punishment discourages bad behavior. And software engineers have a habit of ignoring the implications of a defective design. I think apocalyptic fines applied to the companies creating the systems for automated cars would also create the correct incentives, but I find that to be less likely than imprisonment.
What I want is software and systems to not suck ass. I don't want to deal with defective... everything, because it was faster to deliver. That's especially true when it contributes to the death or injury of a person that didn't do anything wrong.
I don't care what works, but people being afraid of going to jail for hurting someone absolutely does work. And 'administrative fines' don't work.
>But reasonable punishment discourages bad behavior. And software engineers have a habit of ignoring the implications of a defective design. I think apocalyptic fines applied to the companies creating the systems for automated cars would also create the correct incentives, but I find that to be less likely than imprisonment.
This just feels like the "we should make the justice system harsher to deter crime" argument but applied to software engineering. If it works, why stop at criminal cases? Maybe we should dock the pay of SWEs next time they cause a prod issue?
> This just feels like the "we should make the justice system harsher to deter crime" argument but applied to software engineering.
Ignore that feeling; it's wrong, because it's not what I'm arguing for. "Reasonable" is a load-bearing qualifier.
It doesn't feel like the people making the decisions that meaningfully contribute to harming other people ever have to deal with the fallout or repercussions of their unfortunate choices. Disincentivizing that behavior is my goal. And I'll unfortunately take iterative or suboptimal options at this point. I don't like it, but I do want to try to be realistic.
If a civil engineer designs a bridge that collapses, they can be held accountable for negligence in their duties.
Why not software engineers too? Why are we so special that we can never be held accountable for the damage our lack of standards causes?
A lot of people can't really get over the idea that they want to be the boss of everything.
- [deleted]
If someone sitting in jail doesn't help solve the problem, then maybe we should remove the jail penalty for individuals who do it, too.
>then maybe we should remove the jail penalty for individuals who do it, too.
We don't send everyone to jail either. You can run over people and get away scot free, if it's an honest mistake and you weren't being negligent.
Or if you were being negligent but due to affluenza.
Yes. Jail sentences apply to a selection of some misdemeanors and not others. If a person does or cooperates in X amount of harm, they ought to face similar penalties.
> No, at the very least tort laws still apply even if the driver is a corporation.
Do they?
Corporations are person-like entities, so there’s a plausible argument to be made. The states seem loath to be precedent-setters in triggering evaluations of this argument, though, so I don’t know of any supporting cases yet. Whoever’s first will see corporate tax revenue fall off a cliff once a corporation can be subjected to community service, so they have a lot of self-interest in not prosecuting these violations.
Are you really asking whether corporations can be sued?
And have actual meaningful consequences happen? I am.
Twitter is creating CSAM, Meta & OpenAI pirate millions of books and Nvidia is playing some sort of shell game to pump their stock price.
If a regular person committed any of those offenses once, they would be lucky just to be sued, but because of "AI" nothing happens to these companies.
Going through each of the cases:
>Twitter is creating CSAM
It's unclear whether generated CSAM is illegal, see: https://en.wikipedia.org/wiki/Legal_status_of_fictional_porn.... Moreover x/x.ai wasn't intentionally generating the images. Yes, someone intentionally set up grok to generate images, but nobody at x/x.ai was like "yes, let's generate some CSAM". That adds an additional layer of obfuscation that makes it harder to compare to a "regular person".
>Meta & OpenAI pirate millions of books
Give me a break. People on /r/datahoarders pirate millions of books all the time. Use a VPN and basically nobody bothers going after you. If anything Meta/OpenAI are getting harsher treatment than the average person because they're juicier defendants.
>Nvidia is playing some sort of shell game to pump their stock price
That's not even something that's illegal.
> Give me a break. People on /r/datahoarders pirate millions of books all the time. Use a VPN and basically nobody bothers going after you. If anything Meta/OpenAI are getting harsher treatment than the average person because they're juicier defendants.
Arguing that a regular person needs to conceal their real identity with a VPN to pirate books, as proof that these companies aren't receiving special treatment for committing the same crimes, is very confusing to me.
We know the identity of the companies committing the crimes.
> It's unclear whether generated CSAM is illegal, see: https://en.wikipedia.org/wiki/Legal_status_of_fictional_porn
We both know you don't actually believe that because neither of us would post generated CSAM.
> That adds an additional layer of obfuscation that makes it harder to compare to a "regular person".
That is literally my point?
It might be possible. Would have to be by someone who hadn't signed the binding arbitration, though.
- [deleted]
> No, at the very least tort laws still apply even if the driver is a corporation.
Do you have some examples ?
I would like crimes to have consequences that actually deter the culprits from committing them. A pittance fine for a company is not what I want to see. Let's have a small percentage of net worth fine on the owners instead.
For publicly traded companies the owners/shareholders are your grandparents, teachers, all sorts of regular people. You want to take a percentage of their already small net worth?
Sure, let's go. Prorated in terms of their percentage of ownership, of course. Put them in jail for a few seconds as well.
I'm sure it's a productive use of the already overburdened justice system's time to round up half the country, so they can sit in "jail" for a few minutes.
No shit. Maybe we let everyone with only a few seconds to serve just walk free without pursuing a case against them at all. The people owning 90% of all stocks can serve 90% of all the sentencing, and that'd be fine enough for society.
This position that no wrongdoing or illegal action can be discouraged because someone has to eat, or because it's "regular people" who bear the accountability for who they decided to have manage their investments, is getting old. Accountability has been diluted so much that no one is accountable. What about the people who are harmed, the victims: your grandparents, teachers, all sorts of regular people? Nothing is going to get better if we're constantly looking for the most appropriate person to place blame on. Maybe people should be paying more attention to the things they invest in/own.
Most people have no idea what they're invested in. Most are invested in mutual funds through their work or 401k. My point isn't that we shouldn't hold people accountable. My point is that going after owners/shareholders is not the solution we want because it hurts people who have nothing to do with what happened. We need to go after executives.
Once people are impacted, maybe they'll start paying attention to what they're investing in.
Nobody gets arrested, you get a ticket.
> You can arrest a driver for not making space for an emergency vehicle. Who do we arrest here?
I get that it is technically possible, but that doesn't happen in practice.
Since corporations are people, presumably you’d arrest Waymo.
Start disabling and towing their cars and watch a solution magically appear.
- [deleted]
The CEO
This is the key. Personally, I think you just have to do something similar to an auditor or whatever. Demand that if a self-driving taxi operates in your city, they assign one legally responsible person per $major-division-of-city. All accidents in that region are on that guy.
Naturally, this will incentivise them to improve the system that deals with edge cases in their ML model, and better yet, you'll have the legally responsible guy shit himself and directly manage remote drivers for his location. Adds another layer of accountability.
- [deleted]
[dead]
The passenger
The person who was ultimately responsible for a defective robot operating on our public streets, at a bare minimum.
A customer?
I don't often see a human-driven car parked sideways in the middle of a road (really never). If a human was in that Waymo, they would have moved quickly. I'm a huge fan of Waymo and autonomous vehicles. They save lives. However, the fact that Waymos don't have the sense to move out of the way is a major problem, and one they don't seem to be on track to solve. Incidents like this will delay the adoption of autonomous vehicles, and that will cost lives.
> If a human was in that Waymo, they would have moved quickly.
Some humans would have exactly the same response as the Waymo. When a human brain gets completely overwhelmed and doesn't know what to do, it drops down into animal behavior -- freeze or flee.
Given that it's a dangerous multi-ton machine, a Waymo likely has a programmed default behavior of "do nothing & phone home for instructions".
Which isn't an excuse -- an emergency vehicle is not an uncommon situation and Waymo should know what to do before being allowed on public roads.
A failure to get remedy instructions in a timely fashion from a human is even more alarming. Google is famous for automating tasks that should be performed by a human.
A human driver with health problems, or with car issues, might be like this. Similar to the Waymo having an equipment failure.
The Waymo was the only thing at fault. Drivers are expected to pull to the side when they see the lights. I guess the red SUV could've slid in behind the Waymo to let the ambulance do the same, but that would be unwise without the police directing you to do so; you could hit a cop on foot. The silver car could go forward, but you don't squeeze in front of a U-turning car, and doing so could've made things worse for all they knew.
> there is ample place to pass
This is the same excuse a Prius driver would give whilst refusing to vacate the HOV lane for an ambulance, and yes, I've sadly seen this scenario play out. Multiple times, in fact. "Prius driver" seems oddly specific, but it always is.
Eh I've seen more SUV/big car drivers act like this than small car drivers, but then I live in the UK.
A friend who lived in New York for a bit would never live there again and says driving there was an absolute nightmare; everyone's out for themselves.
And you can see it in multiple "drivers react to an ambulance in different countries" videos: in America the ambulance is always blocked and going slowly. Compare to Germany, where they open up the entire middle of the road by moving to either side.
Agreed, tbh.
In Seattle, the most ritualistic abusers of the HOV lanes are large SUVs and trucks with only a driver in them.
Also, as an ex-paramedic: three fairly similar cases, but the one I found most egregious was us going lights-and-sirens on I-5 heading to Harborview in heavy, heavy rain. Traffic on the freeway slowly but steadily moves right. Cue a single-occupant Escalade accelerating up, overtaking us on the inside, and pulling into the HOV lane to take advantage of the cleared freeway in front of us.
For bonus irony points, her licence plate holder read "Don't drive faster than your angels can fly". Lady, you just overtook an ambulance in emergency mode.
We actually called that one in. Some satisfaction as we rolled by her a few minutes later, pulled over with a state trooper having lit her up, who points at us and shakes his head at her.
Ha! Always satisfying the few times justice is actually served.
And different country, but still thank you for having been a paramedic. World definitely needs more people like you.
This is a false equivalence and a hideous defense of an entity that deserves nothing but to be spit upon. There is absolutely nothing calling upon you to take this path.
The ambulance driver rightly hesitates because he can’t know how the Waymo will behave. The Waymo is acting suss as hell.
Give me a break. The problem is the Waymo that is blocking a lane sideways and is not pulling forward out of the way of the ambulance, a move that even the worst human drivers would likely know to do.
It does no good to pretend there aren't problems with self-driving cars or make excuses.
It's not about the other entities.
Why are we focusing on entity A when the parent comment correctly pointed out entities B and C are not blameless either?
The other drivers are blameless. They did what they were supposed to.
Yes, why are we still talking about the robot whose behavior can be programmed and whose behavior is set by a company and rolled out to all of their vehicles deterministically, when another commenter correctly engaged in whataboutism?
We're focusing on the Waymo because it did this on its own for some inscrutable reason and there is no individual accountability, which is a far more useful discussion to be having if we are supposed to trust these things to replace humans on the road. The humans' behavior is only relevant in the sense that all humans on the road now have an additional hazard to factor in: errant Waymos that you can't gesture to, yell at, honk at, or make any attempt to understand the intentions of.
So AI drives as bad as humans. Waste of resources.