This Cruise accident in SF is confirming my suspicions about autonomous software
Kitskaboodle JAN 24, 11:03 PM
I have pondered this for years:
When car manufacturers create the software for their cars' self-driving modes, are they accounting for, and programming in, “accident avoidance”?
Sometimes I question this. Here is an example of what I mean. About a week ago, a “Cruise” autonomous taxi collided with an SF fire truck in the middle of an intersection in downtown SF. As the fire truck approached the intersection, there was a car in front of it, so the fire truck swerved around the car on the left, went through the intersection, and tried to cut back to the right side as it exited. The Cruise taxi kept coming through the intersection in its own lane and collided with the fire truck.
Here is my point: I assume the software developers are programming these cars “according to the rules of the road” (for the most part), but life on the road doesn't always go by the book. The fire truck had the right to swerve onto the other side of the road to avoid cars and keep moving. This makes me think these cars cannot react or respond to anything outside the normal rules of the road.
What about things that suddenly intrude into your lane or your space, like a dog, cat, squirrel, or some other animal darting right in front of you? Can these cars think ahead, spot bad drivers behind, beside, or in front of you doing stupid stuff (being on their phone, riding their brakes, driving too slow, swerving, etc.), and then take action to avoid that obstacle or get away from it? I think not...
I was taught to scan the road ahead and watch for hazards, keep safe braking distances, stay clear of poor drivers, and if someone tailgates, merge over and let them by.
Why do I mention all these points? Because I really question whether these cars “scan the road ahead to look for any hazards” and actively practice “accident avoidance”.
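To be clear about what I mean by “accident avoidance”: even something as simple as a time-to-collision check would count. Here is a rough Python sketch (purely my own illustration with made-up thresholds, not anything a manufacturer actually runs):

code

# Toy "scan the road ahead" check: estimate time-to-collision (TTC)
# for something in our lane and pick a response. Purely illustrative;
# real systems fuse radar/lidar/camera data and do far more than this.

def time_to_collision(gap_m, own_speed_mps, obj_speed_mps):
    """Seconds until we reach the object, assuming constant speeds."""
    closing_speed = own_speed_mps - obj_speed_mps
    if closing_speed <= 0:
        return float("inf")  # not closing on it; no collision course
    return gap_m / closing_speed

def avoidance_action(gap_m, own_speed_mps, obj_speed_mps):
    ttc = time_to_collision(gap_m, own_speed_mps, obj_speed_mps)
    if ttc < 1.5:
        return "emergency brake, swerve if the adjacent lane is clear"
    if ttc < 4.0:
        return "brake and open up following distance"
    return "keep scanning"

# A dog darts out 25 m ahead while we're doing ~45 mph (20 m/s):
print(avoidance_action(gap_m=25.0, own_speed_mps=20.0, obj_speed_mps=0.0))
# TTC = 1.25 s -> "emergency brake, swerve if the adjacent lane is clear"

My question is whether the real software does this kind of scanning continuously, the way a good driver does.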
Lastly, are autonomous cars really better and safer than a human driver with many years of experience dealing with all the unpredictability on our roads?
What do you think about self-driving software?
Kit

[This message has been edited by Kitskaboodle (edited 01-24-2024).]

Cliff Pennock JAN 25, 06:43 AM
I've been closely following the development of self-driving cars, and as things stand right now, they are already safer than human drivers. The problem is that whenever an accident happens with an autonomous car, it's immediately front-page news and people jump on the "autonomous cars are unsafe!" bandwagon. The thing is, the accidents that hit the news are pretty much the only accidents these cars have.

There are almost 7 million vehicle accidents in the US each year. Number of fatalities: between 40,000 and 45,000. Number of driverless-car accidents: slightly over 100. Fatalities: between 10 and 15.

Now of course there are many more "normal" vehicles than autonomous vehicles. But if you do the math, you will find that autonomous cars are much, much safer. Like 250 times safer.
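If you want to do that math yourself, here's the back-of-the-envelope version as a small Python sketch. The mileage figures are placeholders I made up for illustration; the ratio you get depends entirely on the numbers you plug in:

code

# Back-of-the-envelope safety comparison: accidents per million miles.
# The accident counts are from the post above; BOTH mileage figures
# are placeholder assumptions -- substitute whatever numbers you trust.

human_accidents_per_year = 7_000_000   # US, roughly (from above)
human_miles_per_year = 3.0e12          # ~3 trillion vehicle-miles (assumed)

av_accidents_per_year = 100            # roughly (from above)
av_miles_per_year = 1.0e8              # HYPOTHETICAL driverless miles/year

human_rate = human_accidents_per_year / human_miles_per_year * 1e6
av_rate = av_accidents_per_year / av_miles_per_year * 1e6

print(f"Human-driven: {human_rate:.2f} accidents per million miles")
print(f"Driverless:   {av_rate:.2f} accidents per million miles")
print(f"Human rate is {human_rate / av_rate:.1f}x the driverless rate")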

Tesla's current self-driving software (which is still in beta) is moving away from hand-coded rules and switching to a learned, AI-based approach. The AI will already know everything the current system knows, but it will also learn as it encounters situations not yet in its "database". And the nice thing is that what one car learns gets added to the knowledge of all the other cars.
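Conceptually, that fleet-learning loop looks something like the toy sketch below. This is my own simplification of the general idea, not Tesla's actual pipeline:

code

# Toy fleet-learning loop: cars log unfamiliar situations, a central
# trainer folds them into one shared model, and every car downloads
# the improved version. Illustrative only; real pipelines retrain
# neural networks on sensor data, not string labels.

from dataclasses import dataclass, field

@dataclass
class Model:
    version: int = 1
    known_situations: set = field(default_factory=set)

class Car:
    def __init__(self, model):
        self.model = model
        self.edge_cases = []

    def drive(self, situation):
        if situation not in self.model.known_situations:
            self.edge_cases.append(situation)  # log it for the trainer

def central_training_round(cars, model):
    # Gather every car's logged edge cases and fold them into one model.
    for car in cars:
        model.known_situations.update(car.edge_cases)
        car.edge_cases.clear()
    model.version += 1
    for car in cars:  # push the updated model fleet-wide
        car.model = model
    return model

model = Model()
fleet = [Car(model) for _ in range(3)]
fleet[0].drive("fire truck crossing against the light")  # one car learns...
model = central_training_round(fleet, model)
print(model.version, model.known_situations)  # ...now every car knows it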

I think autonomous cars are a good thing. Nothing is 100% fool-proof, of course, and accidents will still happen. But even among those roughly 100 accidents per year, most of the time a human driver is at fault, and the accident would not have been prevented if the autonomous car had been a conventional car instead.
82-T/A [At Work] JAN 25, 07:59 AM

quote
Originally posted by Kitskaboodle:

I have pondered for years about this:
When car manufacturers create software for their cars (for self-driving mode), are they accounting for and programming in “accident avoidance”?
...
Can these cars think ahead, spot bad drivers doing stupid stuff, and then take action to avoid that obstacle or get away from it? I think not...
Lastly, are autonomous cars really better and safer than a human driver with many years of experience on the road?
Kit




Technically, all forms of self-driving cars already use "AI" ... everything from computer vision (object detection) to various other machine-learning algorithms. The biggest problem with AI right now is that it lacks reasoning. Even a semi-conscious human can make sensible decisions based on "common sense," which is inherent in our ability to think. An AI learning model is only as good as the information it's fed... and unless something in it allows it to anticipate (prediction), it cannot infer what to do... meaning, it can only respond in the immediate sense (immediate collision avoidance where possible).
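To illustrate the difference between reacting and anticipating, here's a toy sketch (my own, with made-up numbers): the purely reactive function brakes only when something is already close, while the predictive one extrapolates the other vehicle's path and brakes before the paths ever cross.

code

# Toy contrast: reactive vs. predictive collision handling. The other
# vehicle crosses our path at constant velocity. All numbers made up.

def reactive_should_brake(gap_m):
    # Reacts only to what is close right now.
    return gap_m < 10.0

def predictive_should_brake(our_pos, our_vel, their_pos, their_vel,
                            horizon_s=4.0, step_s=0.1):
    # Extrapolate both constant-velocity paths and flag a near-miss
    # anywhere inside the prediction horizon.
    steps = int(horizon_s / step_s)
    for i in range(steps + 1):
        t = i * step_s
        ox, oy = our_pos[0] + our_vel[0] * t, our_pos[1] + our_vel[1] * t
        tx, ty = their_pos[0] + their_vel[0] * t, their_pos[1] + their_vel[1] * t
        if ((ox - tx) ** 2 + (oy - ty) ** 2) ** 0.5 < 5.0:
            return True  # predicted paths come within 5 m -- brake early
    return False

# We head north at 15 m/s; a fire truck 40 m east and 40 m ahead of us
# heads west at 15 m/s, crossing our lane.
ours, theirs = (0.0, 0.0), (40.0, 40.0)
print(reactive_should_brake(gap_m=56.6))                                 # False
print(predictive_should_brake(ours, (0.0, 15.0), theirs, (-15.0, 0.0)))  # True

Both functions see the exact same world at time zero; only the second one infers the collision that hasn't happened yet.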

All of that said... there will come a time when these learning models have accounted for almost everything... to the point where society will consider a human driving to be a greater risk than an AI-controlled car. I suspect we'll see this in a lot of fields, including medical surgery. It's kind of the natural progression of things. The important thing is that we ensure we maintain the *right* to drive... and aren't pressured out of it through wild insurance costs and legislation that makes driving obscenely cost-prohibitive.
Mickey_Moose JAN 25, 02:15 PM
lol

maryjane JAN 25, 02:37 PM
I'd be more suspicious of door plugs flying off Boeing airplanes.
Patrick JAN 25, 10:46 PM

quote
Originally posted by 82-T/A [At Work]:

...to the point where society will consider a human driving to be a greater risk than an AI-controlled car.



We're already there. Take a look around you when you're out on the road. Too many dick-wads "hiding" their phones in their lap, paying more attention to some brainless tweet than to traffic conditions. It won't be long before we look back on the days before AI-controlled cars and wonder how the hell we ever made it safely from Point A to Point B.

[This message has been edited by Patrick (edited 01-25-2024).]

maryjane JAN 26, 01:51 PM

quote
Originally posted by Patrick:

We're already there. Take a look around you when you're out on the road. Too many dick-wads "hiding" their phones in their lap, paying more attention to some brainless tweet than to traffic conditions.



How could you know this if you aren't looking around at what brainless dik-wad people are doing instead of paying attention to your own traffic conditions?
Patrick JAN 26, 08:36 PM

quote
Originally posted by maryjane:

How could you know this if you aren't looking around at what brainless dik-wad people are doing instead of paying attention to your own traffic conditions?



Simple... "looking around" and being aware of how many distracted dick-wads are around me while I'm stopped at a traffic light is part and parcel of "paying attention" to my "own traffic conditions". So there!

[EDIT] I should add that I actually notice this the most while looking out of my kitchen window. Seriously! I live on a corner lot located at the intersection of a side-street and a fairly busy street, and my kitchen window gives me a great vantage point. I can't believe the number of drivers I observe who are effing around with their phones while in the midst of turning off of the busy street onto the side-street (and vice-versa). Yes, it's illegal here to be using a cell phone while driving, but that doesn't stop these idiots.

[This message has been edited by Patrick (edited 01-27-2024).]

82-T/A [At Work] JAN 27, 09:06 PM

quote
Originally posted by Patrick:

We're already there. Take a look around you when you're out on the road. Too many dick-wads "hiding" their phones in their lap, paying more attention to some brainless tweet than to traffic conditions. It won't be long before we look back on the days before AI-controlled cars and wonder how the hell we ever made it safely from Point A to Point B.




Yes, but that's not what I meant. I meant there's going to be a time when "society" views a human driving a car as being more dangerous than autonomous control. At that point, I suspect driving will likely be priced out through things like insurance mandates and what have you. We're probably at least 20 years away from that.
BingB JAN 28, 10:04 AM
In the United States there are already driverless tractor-trailers running on our highways. Most people don't know this.

Not only are they safer than human drivers (at least on highways, for now), but they can also run almost 24 hours a day. A single driverless truck won't take the place of one truck driver; it will replace almost three. That means HUGE savings for trucking companies.
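Here's the rough math behind "almost three" (my assumed figures; US hours-of-service rules cap a human at 11 driving hours a day, and real-world averages run lower):

code

# Rough utilization math behind "one driverless truck replaces ~3 drivers".
# Both figures below are assumptions for illustration, not industry data.

human_driving_hours_per_day = 8     # typical average under the 11 h HOS cap (assumed)
truck_operating_hours_per_day = 22  # driverless, minus fueling/maintenance (assumed)

drivers_replaced = truck_operating_hours_per_day / human_driving_hours_per_day
print(f"One autonomous truck covers the hours of ~{drivers_replaced:.1f} drivers")
# -> ~2.8, i.e. "almost three"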

Truck driving is one of the biggest trades in America, paying excellent wages. But the same used to be said about welders, until they were largely replaced in manufacturing by robots 20-30 years ago.