Night Risk Mitigation for Tesla Autopilot [Limited Scope]


Old 09-11-22, 02:51 PM
  #1  
flangehead
Senior Member
Thread Starter
 
Join Date: Dec 2006
Location: Houston, TX
Posts: 903

Bikes: 2017 Co-op ADV 1.1; ~1991 Novara Arriba; 1990 Fuji Palisade; mid-90's Moots Tandem; 1985 Performance Superbe

Mentioned: 1 Post(s)
Tagged: 0 Thread(s)
Quoted: 385 Post(s)
Liked 562 Times in 328 Posts
Night Risk Mitigation for Tesla Autopilot [Limited Scope]

[Limited scope: I want to try something here to see if it can catch on in A&S. I'd like to limit this thread, voluntarily, to ideas about actions that cyclists who have no choice but to ride on the street at night can take to reduce their risk due to this threat. Objective is to make this A&S thread more useful.]

Two Teslas on autopilot have recently plowed into motorcyclists:


Who knows what lights the motorcycles had facing rearward, but here's a random Yamaha.

Both motorcyclists were in HOV lanes on probably pretty big (Yamaha V Star, Harley) motorcycles. The recommendation from the YouTuber is to weave from side to side when a vehicle is approaching from behind at night.

I don't frequent HOV lanes, but I don't have a lot of confidence that the Tesla bug is limited to that situation...

I know in my case I have a triangle of red LED lights running with one blinking, and I have a rear-facing helmet light that I either keep in flashing mode (significant traffic) or turn on when I see a vehicle approaching from behind (sparse traffic). Maybe that is a visual that a Tesla Autopilot will pick up on... I sure hope so.

Any other ideas on what I can do to reduce the risk of getting Tesla'd? I've watched a motorist in my area never look up from their lap for a mile and a half, so this isn't some kind of theoretical possibility.
flangehead is offline  
Old 09-11-22, 03:15 PM
  #2  
JW Fas
Cop Magnet
 
JW Fas's Avatar
 
Join Date: Oct 2018
Posts: 331
Mentioned: 2 Post(s)
Tagged: 0 Thread(s)
Quoted: 240 Post(s)
Liked 274 Times in 128 Posts
I'm surprised there isn't more outrage from Tesla customers. Tesla recently raised the price of its Autopilot option, yet it has removed tech from it.
JW Fas is offline  
Old 09-12-22, 01:29 PM
  #3  
I-Like-To-Bike
Been Around Awhile
 
I-Like-To-Bike's Avatar
 
Join Date: Oct 2004
Location: Burlington Iowa
Posts: 29,950

Bikes: Vaterland and Ragazzi

Mentioned: 0 Post(s)
Tagged: 0 Thread(s)
Quoted: 12 Post(s)
Liked 1,517 Times in 1,031 Posts
Originally Posted by JW Fas
I'm surprised there isn't more outrage from Tesla customers. Tesla recently raised the price of its Autopilot option, yet it has removed tech from it.
I would expect more outrage from people who are NOT Tesla customers, but rather may become victims of the defective, prototype so-called "autonomous" driving systems being tested on the public highways by Tesla and its ilk, much to the delight of ardent technophiles and financial speculators.
I-Like-To-Bike is offline  
Old 09-12-22, 03:03 PM
  #4  
noimagination
Senior Member
 
Join Date: Oct 2016
Posts: 718
Mentioned: 6 Post(s)
Tagged: 0 Thread(s)
Quoted: 362 Post(s)
Liked 414 Times in 244 Posts
I'm no technophile myself. It is obvious that any bug needs to be addressed quickly, throughout the lifecycle of the product but particularly at the development stage, during early alpha/beta testing, and during early actual use. I agree that current laws and regulations are not adequate, and it would seem obvious that features taking decision making out of the hands of a human driver should not be implemented until adequate laws and regulations exist governing how these "features" are developed: what types of road "hazards" must be detected and under what conditions/circumstances; what other performance standards they need to meet; what post-market data collection and analysis is required to detect possible bugs; what standards for protection against malware must be implemented; etc. Industry as a whole, and Tesla in particular, has amply demonstrated through its actions over many years that it cannot be trusted to develop safe products without oversight.

However, given the abysmal performance of many drivers, one has to ask how much worse autonomous vehicles would really be than human drivers. Right now I'd agree that autonomous systems need development, that the companies developing them need to be watched closely, and that such companies need to be called to account when their products are obviously, or even potentially, deficient. The stakes are very high, particularly for vulnerable road users like cyclists, motorcyclists, and pedestrians...
noimagination is online now  
Old 09-12-22, 03:32 PM
  #5  
genec
genec
 
genec's Avatar
 
Join Date: Sep 2004
Location: West Coast
Posts: 27,079

Bikes: custom built, sannino, beachbike, giant trance x2

Mentioned: 86 Post(s)
Tagged: 0 Thread(s)
Quoted: 13658 Post(s)
Liked 4,532 Times in 3,158 Posts
Originally Posted by noimagination
I'm no technophile myself. It is obvious that any bug needs to be addressed quickly, throughout the lifecycle of the product but particularly at the development stage, during early alpha/beta testing, and during early actual use. I agree that current laws and regulations are not adequate, and it would seem obvious that features taking decision making out of the hands of a human driver should not be implemented until adequate laws and regulations exist governing how these "features" are developed: what types of road "hazards" must be detected and under what conditions/circumstances; what other performance standards they need to meet; what post-market data collection and analysis is required to detect possible bugs; what standards for protection against malware must be implemented; etc. Industry as a whole, and Tesla in particular, has amply demonstrated through its actions over many years that it cannot be trusted to develop safe products without oversight.

However, given the abysmal performance of many drivers, one has to ask how much worse autonomous vehicles would really be than human drivers. Right now I'd agree that autonomous systems need development, that the companies developing them need to be watched closely, and that such companies need to be called to account when their products are obviously, or even potentially, deficient. The stakes are very high, particularly for vulnerable road users like cyclists, motorcyclists, and pedestrians...
For some odd reason, we give a pass to drivers that kill others... And this happens roughly 40,000 times a year. Oh sure, some drivers are arrested and convicted... but that generally occurs only in cases of alcohol or drug abuse associated with those drivers.

On the other hand, we now seem to demand that AVs be 100% error free... not just slightly better than human drivers or half as bad as human drivers... (like say 20,000 deaths a year), but no, AVs have to be nearly perfect. 99% perfect would still equate to 400 deaths a year. Will that be good enough? I doubt it. And certainly 90% better than humans would be 4000 deaths a year... no way that would be considered acceptable. But hey, 40,000 deaths by errant humans is "just an accident," right?
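If you want to check my arithmetic, here's a rough Python sketch (it just applies each hypothetical reduction to the approximate 40,000-per-year figure; that baseline is a round number, not an official statistic):

Code:
# Back-of-the-envelope check: apply each hypothetical reduction to the
# roughly 40,000 annual US road deaths cited above (an approximation).
baseline_deaths = 40_000

for reduction in (0.50, 0.90, 0.99):  # "half as bad", "90% better", "99% perfect"
    remaining = baseline_deaths * (1 - reduction)
    print(f"{reduction:.0%} reduction -> about {remaining:,.0f} deaths per year")

# Prints:
# 50% reduction -> about 20,000 deaths per year
# 90% reduction -> about 4,000 deaths per year
# 99% reduction -> about 400 deaths per year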
genec is offline  
Likes For genec:
Old 09-12-22, 03:57 PM
  #6  
Bald Paul
Senior Member
 
Bald Paul's Avatar
 
Join Date: Sep 2017
Location: Upstate SC
Posts: 1,680
Mentioned: 3 Post(s)
Tagged: 0 Thread(s)
Quoted: 806 Post(s)
Liked 1,612 Times in 764 Posts
Let's get one thing straight - there are no 'self-driving' cars. California is currently going after Tesla for advertising its vehicles as "self driving," but if you look at Tesla's website, you'll see this (I highlighted the important part):



Before retiring, I was a regional technical specialist and instructor for an import auto manufacturer (28 years). That manufacturer developed a camera-based system that helped avoid accidents by auto-braking and even auto-steering the car if needed. It was NOT an autonomous vehicle, nor was it advertised as such. However, those of us who truly knew and understood the system also knew and understood its limitations, and its ability to "see" motorcycles, bicycles, and at times pedestrians was not 100% guaranteed.
Bald Paul is offline  
Old 09-12-22, 04:19 PM
  #7  
I-Like-To-Bike
Been Around Awhile
 
I-Like-To-Bike's Avatar
 
Join Date: Oct 2004
Location: Burlington Iowa
Posts: 29,950

Bikes: Vaterland and Ragazzi

Mentioned: 0 Post(s)
Tagged: 0 Thread(s)
Quoted: 12 Post(s)
Liked 1,517 Times in 1,031 Posts
Originally Posted by genec
On the other hand, we now seem to demand that AVs be 100% error free... not just slightly better than human drivers or half as bad as human drivers... (like say 20,000 deaths a year), but no, AVs have to be nearly perfect. 99% perfect would still equate to 400 deaths a year. Will that be good enough?
How close to perfect do you believe the Tesla Autopilot system was when it ran over the motorcyclists, even with its allegedly required fully attentive drivers (who allegedly truly knew and understood the system) keeping their hands on the wheel and prepared to take over, as the Autopilot system allegedly requires in order to operate?

You are either guessing or just parroting the hype when you imply that there is any evidence or quantitative data about any degree of perfection, error-free operation, or comparison with real-world human drivers for any of the current crop of so-called "self-driving" systems being tested on public highways.
I-Like-To-Bike is offline  
Likes For I-Like-To-Bike:
Old 09-13-22, 03:58 PM
  #8  
Gear_Admiral 
Newbie
 
Gear_Admiral's Avatar
 
Join Date: Sep 2022
Location: Honam
Posts: 36
Mentioned: 0 Post(s)
Tagged: 0 Thread(s)
Quoted: 24 Post(s)
Likes: 0
Liked 15 Times in 9 Posts
Originally Posted by I-Like-To-Bike
How close to perfect do you believe the Tesla Autopilot system was when it ran over the motorcyclists, even with its allegedly required fully attentive drivers (who allegedly truly knew and understood the system) keeping their hands on the wheel and prepared to take over, as the Autopilot system allegedly requires in order to operate?

You are either guessing or just parroting the hype when you imply that there is any evidence or quantitative data about any degree of perfection, error-free operation, or comparison with real-world human drivers for any of the current crop of so-called "self-driving" systems being tested on public highways.
The disconnect that some autonomous vehicle fans have ...

These cars are, at *best*, another decade away. Musk has been touting full self-driving "next year" since 2014. Look at Waymo or any auto manufacturer like Honda doing this kind of research: they are projecting it way out in the future. Musk doesn't have an advanced degree in any STEM field. He's a con man. He doesn't have some secret sauce that no one else has. I, like you, do not demand perfection, but I do demand better than "a small fraction as good as current drivers." I do not discount the possibility of superior AI-controlled cars. I just don't want them that much, and I don't want to hold my breath waiting for something possibly decades away.

To the OP's point: there is nothing you can do beyond moving away or giving up on getting around in anything other than a big car yourself. California, for one, is letting a dangerous, negligent car salesman pre-alpha test multi-ton self-driving vehicles that can accelerate from 0 to 60 in 3 seconds on the public roads.
Gear_Admiral is offline  
Likes For Gear_Admiral:
Old 09-21-22, 07:23 AM
  #9  
livedarklions
Tragically Ignorant
 
livedarklions's Avatar
 
Join Date: Jun 2018
Location: New England
Posts: 15,613

Bikes: Serotta Atlanta; 1994 Specialized Allez Pro; Giant OCR A1; SOMA Double Cross Disc; 2022 Allez Elite mit der SRAM

Mentioned: 62 Post(s)
Tagged: 0 Thread(s)
Quoted: 8186 Post(s)
Liked 9,094 Times in 5,053 Posts
Originally Posted by genec
On the other hand, we now seem to demand that AVs be 100% error free... not just slightly better than human drivers or half as bad as human drivers... (like say 20,000 deaths a year), but no, AVs have to be nearly perfect. 99% perfect would still equate to 400 deaths a year. Will that be good enough? I doubt it. And certainly 90% better than humans would be 4000 deaths a year... no way that would be considered acceptable. But hey, 40,000 deaths by errant humans is "just an accident," right?
Congratulations, I think that may be the most strawman arguments and made-up statistics I have ever seen packed into one small paragraph. I'm pretty sure no one is satisfied with 40,000 deaths as acceptable; we just don't accept, right now, that AI control is capable of reducing that number. You're acting like there's overwhelming evidence for that acceptance of AI control, and there just isn't.

We haven't gotten to the point where AVs have demonstrated basic competence, let alone improved safety, and the issue for us as cyclists is that we will be dealing with a mix of AVs and driver/operator-controlled vehicles. I'm pretty sure none of us are intending to ride AI-controlled bicycles. (I'm also pretty sure that for the foreseeable future, the vast majority of motor vehicles are going to be driver controlled, so your "instant reduction" scenario is absurd on its face.) My understanding is that AI is having trouble identifying cyclists and also generally anticipating the likely actions of non-AI operators--we don't function according to computer-generated algorithms.
livedarklions is offline  
Old 09-21-22, 08:36 AM
  #10  
genec
genec
 
genec's Avatar
 
Join Date: Sep 2004
Location: West Coast
Posts: 27,079

Bikes: custom built, sannino, beachbike, giant trance x2

Mentioned: 86 Post(s)
Tagged: 0 Thread(s)
Quoted: 13658 Post(s)
Liked 4,532 Times in 3,158 Posts
Originally Posted by livedarklions
Congratulations, I think that may be the most strawman arguments and made-up statistics I have ever seen packed into one small paragraph. I'm pretty sure no one is satisfied with 40,000 deaths as acceptable; we just don't accept, right now, that AI control is capable of reducing that number. You're acting like there's overwhelming evidence for that acceptance of AI control, and there just isn't.

We haven't gotten to the point where AVs have demonstrated basic competence, let alone improved safety, and the issue for us as cyclists is that we will be dealing with a mix of AVs and driver/operator-controlled vehicles. I'm pretty sure none of us are intending to ride AI-controlled bicycles. (I'm also pretty sure that for the foreseeable future, the vast majority of motor vehicles are going to be driver controlled, so your "instant reduction" scenario is absurd on its face.) My understanding is that AI is having trouble identifying cyclists and also generally anticipating the likely actions of non-AI operators--we don't function according to computer-generated algorithms.
OK. Let's make some assumptions to test my THEORY.

Let's say we finally get the AI nearly right and it can do Level 5 Self Driving... at what point is it acceptable as a replacement for human drivers?
  1. Would you accept that AVs (with AI) are superior to human drivers if the number of traffic deaths was cut in half... to say merely 20,000 a year?
  2. How about if AVs were 90% perfect, and the number of traffic deaths was reduced to a mere 4000 a year?
  3. Or do you demand 99% perfection of AI, and only 400 deaths a year?
  4. Or will you only accept 100% perfection of AI... and no traffic deaths?
So what is "good enough?" 1, 2, 3 or only 4... and why? Considering that right now we "allow" for 40,000 deaths by drivers on our highways each year?
genec is offline  
Old 09-21-22, 08:52 AM
  #11  
livedarklions
Tragically Ignorant
 
livedarklions's Avatar
 
Join Date: Jun 2018
Location: New England
Posts: 15,613

Bikes: Serotta Atlanta; 1994 Specialized Allez Pro; Giant OCR A1; SOMA Double Cross Disc; 2022 Allez Elite mit der SRAM

Mentioned: 62 Post(s)
Tagged: 0 Thread(s)
Quoted: 8186 Post(s)
Liked 9,094 Times in 5,053 Posts
Originally Posted by genec
OK. Let's make some assumptions to test my THEORY.

Let's say we finally get the AI nearly right and it can do Level 5 Self Driving... at what point is it acceptable as a replacement for human drivers?
  1. Would you accept that AVs (with AI) are superior to human drivers if the number of traffic deaths was cut in half... to say merely 20,000 a year?
  2. How about if AVs were 90% perfect, and the number of traffic deaths was reduced to a mere 4000 a year?
  3. Or do you demand 99% perfection of AI, and only 400 deaths a year?
  4. Or will you only accept 100% perfection of AI... and no traffic deaths?
So what is "good enough?" 1, 2, 3 or only 4... and why? Considering that right now we "allow" for 40,000 deaths by drivers on our highways each year?

If it actually made people safer (not just in terms of numbers of deaths, btw), then of course it should be adopted. You made up all that crap about demanding 99% perfection (whatever the hell that means) out of your own head--that's your strawman. No idea why a 99% reduction in deaths translates to 99% perfection, but that's your inane terminology, not mine.

All of this is irrelevant, because there's absolutely no reason to assume your numbers have anything to do with reality. And of course the 50% reduction in deaths would be a good thing, who besides you said it wouldn't be?

So tell me, what is your evidence for the proposition that people are demanding "100% perfection"? That's the kind of strawman thrown out by people who can't defend the current state of the technology as safe for road use--it's a "look at the birdie" distraction from having no real argument. You didn't really think I was going to say that the number of deaths shouldn't be reduced, did you?
livedarklions is offline  
Old 09-21-22, 09:05 AM
  #12  
genec
genec
 
genec's Avatar
 
Join Date: Sep 2004
Location: West Coast
Posts: 27,079

Bikes: custom built, sannino, beachbike, giant trance x2

Mentioned: 86 Post(s)
Tagged: 0 Thread(s)
Quoted: 13658 Post(s)
Liked 4,532 Times in 3,158 Posts
Originally Posted by livedarklions
If it actually made people safer (not just in terms of numbers of deaths, btw), then of course it should be adopted. You made up all that crap about demanding 99% perfection (whatever the hell that means) out of your own head--that's your strawman. No idea why a 99% reduction in deaths translates to 99% perfection, but that's your inane terminology, not mine.

All of this is irrelevant, because there's absolutely no reason to assume your numbers have anything to do with reality. And of course the 50% reduction in deaths would be a good thing, who besides you said it wouldn't be?

So tell me, what is your evidence for the proposition that people are demanding "100% perfection"? That's the kind of strawman thrown out by people who can't defend the current state of the technology as safe for road use--it's a "look at the birdie" distraction from having no real argument. You didn't really think I was going to say that the number of deaths shouldn't be reduced, did you?
I have no "evidence" per se, other than the comments I read about people saying AVs won't work... and their citations of "it won't be perfect." I have a THEORY... I am testing that theory.

So, again, per my earlier post... at what point will you accept AI as the driver? At what level of deaths on the roadways would you find it acceptable to prefer robot drivers over human drivers?
(I am betting you will again avoid a direct answer... )

As far as "making people safer" as you suggest... you mean making better drivers??? Gosh, has that EVER worked?
genec is offline  
Old 09-21-22, 10:11 AM
  #13  
livedarklions
Tragically Ignorant
 
livedarklions's Avatar
 
Join Date: Jun 2018
Location: New England
Posts: 15,613

Bikes: Serotta Atlanta; 1994 Specialized Allez Pro; Giant OCR A1; SOMA Double Cross Disc; 2022 Allez Elite mit der SRAM

Mentioned: 62 Post(s)
Tagged: 0 Thread(s)
Quoted: 8186 Post(s)
Liked 9,094 Times in 5,053 Posts
Originally Posted by genec
I have no "evidence" per se, other than the comments I read about people saying AVs won't work... and their citations of "it won't be perfect." I have a THEORY... I am testing that theory.

So, again, per my earlier post... at what point will you accept AI as the driver? At what level of deaths on the roadways would you find it acceptable to prefer robot drivers over human drivers?
(I am betting you will again avoid a direct answer... )

As far as "making people safer" as you suggest... you mean making better drivers??? Gosh, has that EVER worked?
I did answer your question directly. Obviously, if I'm agreeing that a 50% reduction would be a good thing, number 1 would be a good thing. 2-4 are completely pointless questions, and really strawman positions. My point was that it was a stupid question because I have never seen anyone claim they needed perfection or a 90% reduction in road deaths before it could be adopted. Hell, my standard for adoption is actually lower than your #1, because if you're asking me personally, I would accept AI control at the point it was as safe as or better than human drivers. If you want a precise number on it, I'm not claiming to be any kind of authority, so I don't know if 40,000 road deaths is the right measure (I suspect it's a hell of a lot more nuanced than that--highway AI requirements are likely completely different from urban street driving, for example). As far as "preferring" it, that's a rather academic question, as even if and when it's adopted, there will be many, many driver-controlled vehicles on the road, bicycles among them.

I don't see anyone claiming that human drivers are perfect, or that AI drivers need to be perfect before they can replace human drivers. I have accused you of making that up, and I notice that your response to this accusation is to falsely claim I didn't answer your lame attempt at a loaded question.
livedarklions is offline  
Old 09-21-22, 10:52 AM
  #14  
genec
genec
 
genec's Avatar
 
Join Date: Sep 2004
Location: West Coast
Posts: 27,079

Bikes: custom built, sannino, beachbike, giant trance x2

Mentioned: 86 Post(s)
Tagged: 0 Thread(s)
Quoted: 13658 Post(s)
Liked 4,532 Times in 3,158 Posts
Originally Posted by livedarklions
I did answer your question directly. Obviously, if I'm agreeing that a 50% reduction would be a good thing, number 1 would be a good thing. 2-4 are completely pointless questions, and really strawman positions. My point was that it was a stupid question because I have never seen anyone claim they needed perfection or a 90% reduction in road deaths before it could be adopted. Hell, my standard for adoption is actually lower than your #1, because if you're asking me personally, I would accept AI control at the point it was as safe as or better than human drivers. If you want a precise number on it, I'm not claiming to be any kind of authority, so I don't know if 40,000 road deaths is the right measure (I suspect it's a hell of a lot more nuanced than that--highway AI requirements are likely completely different from urban street driving, for example). As far as "preferring" it, that's a rather academic question, as even if and when it's adopted, there will be many, many driver-controlled vehicles on the road, bicycles among them.

I don't see anyone claiming that human drivers are perfect, or that AI drivers need to be perfect before they can replace human drivers. I have accused you of making that up, and I notice that your response to this accusation is to falsely claim I didn't answer your lame attempt at a loaded question.
Perhaps you need to simply read more...

First, we need to understand the safety dilemma that self-driving cars introduce. Currently, there are around 35,000 deaths as a result of motor vehicle crashes in the United States each year. That’s close to 3,000 people every month, or 100 people per day. Hypothetically, if there was a new technology that could reduce that fatality rate by just 1 percent – just 1 person a day – that could save 350 lives annually. Since the majority of car accidents are attributable to human error, and autonomous vehicles can reduce the error rate to close to zero, we can assume that autonomous vehicles can sharply reduce the overall fatality rate of motor vehicle accidents (and reduce total motor vehicle accidents as well).

However, even a single car accident can cause major damage and multiple deaths, and a single incident of major negative publicity casts doubt on the safety and efficacy of self-driving cars in general. If a handful of magnified cases lead the public to believe that self-driving cars are dangerous, we could face delays of autonomous cars for years to come – ultimately resulting in more lives lost.
https://readwrite.com/is-the-public-...car-accidents/

IDTechEx believes with current developments we could see autonomous vehicles matching or exceeding human safety levels by as soon as 2024. If growth is sustained, the report suggests by 2046 self-driving vehicles could meet the total mobility demand of the US – 3 trillion miles per annum. Then by 2050 autonomous vehicles could theoretically meet the entire transport needs of the world, with less than one accident per year. At this point, IDTechEx expects in many countries human driving on public roads will be outlawed in order to prevent injury, accidents and interference with self-driving vehicles that are all communicating amongst themselves.
https://insideevs.com/news/531094/hu...outlawed-2050/

The results show that the respondents believe that self-driving vehicles should be four to five times as safe as human driven vehicles. Current global traffic fatal risk is estimated at 17.4 per 100,000, which is 350 times greater than the frequency accepted by 50 percent of the respondents for self-driving vehicles. This implies that respondents expect these new vehicles to improve safety by two orders of magnitude against the current traffic risk.

Based on the results, the researchers propose the following requirements for self-driving vehicles based on the tolerability of risk in industrial safety (a concept developed in the health and safety field) in which risks are distinguished by three criteria: unacceptable, tolerable and broadly acceptable.

Self-driving vehicles that are less safe than human drivers would be set as the unacceptable risk criterion. The tolerable risk is that self-driving vehicles be four to five times as safe, meaning they should be able to reduce 75-80 percent of current traffic fatalities. The broadly acceptable risk criterion for self-driving vehicles is set as two orders of magnitude lower than current global traffic risk, indicating a hundredfold improvement over current traffic risks, or the same order of magnitude experienced in public transportation modes, such as rail and commercial aviation.
https://www.insurancejournal.com/new.../31/490751.htm
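To put those ratios in concrete terms, here's a rough Python sketch (assuming the 17.4-per-100,000 global baseline quoted above; the exact survey thresholds are in the article):

Code:
# Convert the study's tolerability criteria into fatality rates per 100,000
# per year, starting from the 17.4 global baseline quoted above (an estimate).
baseline = 17.4

tolerable_4x = baseline / 4     # "four times as safe"  -> ~4.4 (a 75% reduction)
tolerable_5x = baseline / 5     # "five times as safe"  -> ~3.5 (an 80% reduction)
acceptable = baseline / 100     # "two orders of magnitude" -> ~0.17 (99% reduction)

print(f"tolerable risk: {tolerable_5x:.2f} to {tolerable_4x:.2f} per 100,000")
print(f"broadly acceptable: {acceptable:.2f} per 100,000")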

As for my THEORY...
Even assuming that technology is flawless, Garg believes proponents still face a “trust gap” with consumers that needs to be overcome before they will fully embrace driverless technology. AIG believes that consumers will have a say in how fast autonomous vehicles penetrate the market.
Flawless, eh? Or what is "good enough?"

Last edited by genec; 09-21-22 at 10:56 AM.
genec is offline  
Old 09-21-22, 11:23 AM
  #15  
Bald Paul
Senior Member
 
Bald Paul's Avatar
 
Join Date: Sep 2017
Location: Upstate SC
Posts: 1,680
Mentioned: 3 Post(s)
Tagged: 0 Thread(s)
Quoted: 806 Post(s)
Liked 1,612 Times in 764 Posts
Even the best technology is subject to failure, and until the electronics are 100% reliable (they never will be, IMHO) the 'self driving car' cannot exist. Besides, humans are still responsible for maintaining their vehicles, and believe me, I've seen some cars being driven on the road that were poorly, if ever, maintained.

Bald Paul is offline  
Old 09-21-22, 11:52 AM
  #16  
genec
genec
 
genec's Avatar
 
Join Date: Sep 2004
Location: West Coast
Posts: 27,079

Bikes: custom built, sannino, beachbike, giant trance x2

Mentioned: 86 Post(s)
Tagged: 0 Thread(s)
Quoted: 13658 Post(s)
Liked 4,532 Times in 3,158 Posts
Originally Posted by Bald Paul
Even the best technology is subject to failure, and until the electronics are 100% reliable (they never will be, IMHO) the 'self driving car' cannot exist. Besides, humans are still responsible for maintaining their vehicles, and believe me, I've seen some cars being driven on the road that were poorly, if ever, maintained.

100% reliable, eh? Don't tell this to livedarklions. He says he doesn't see anyone claiming it has to be perfect.

Frankly I doubt the technology will ever be 100% perfect, but can we accept a significant reduction in human deaths on the roadways?
genec is offline  
Old 09-21-22, 12:46 PM
  #17  
Trakhak
Senior Member
 
Trakhak's Avatar
 
Join Date: Jan 2005
Location: Baltimore, MD
Posts: 5,337
Mentioned: 15 Post(s)
Tagged: 0 Thread(s)
Quoted: 2428 Post(s)
Liked 2,880 Times in 1,645 Posts
It's a shame that the quasi-AV enthusiasts buying Teslas and the like are part of the population of drivers who have been causing 40,000 road deaths a year. And if they're spending all that money to buy a car with eight cameras and driver-assist software, they're sure as hell not going to pay even as much attention as they were when they were driving their dumb cars, which wasn't all that much, what with having to text and apply makeup.

Expecting those drivers to be paying attention when the AV gets confused and is about to plow into an emergency vehicle or a motorcycle or a bicycle is like expecting the people you see on bikes with helmets dangling from their handlebars to put the helmet on just before a crash.
Trakhak is offline  
Likes For Trakhak:
Old 09-21-22, 01:21 PM
  #18  
I-Like-To-Bike
Been Around Awhile
 
I-Like-To-Bike's Avatar
 
Join Date: Oct 2004
Location: Burlington Iowa
Posts: 29,950

Bikes: Vaterland and Ragazzi

Mentioned: 0 Post(s)
Tagged: 0 Thread(s)
Quoted: 12 Post(s)
Liked 1,517 Times in 1,031 Posts
Originally Posted by genec
100% reliable, eh? Don't tell this to livedarklions. He says he doesn't see anyone claiming it has to be perfect.

Frankly I doubt the technology will ever be 100% perfect, but can we accept a significant reduction in human deaths on the roadways?
Sure "we" can accept a significant reduction in human deaths on the roadways. You got any? - besides a handful of salesmen/promoters/technophiles (and their presumably hired researchers) speculating and guessing about what might happen at some future date with some currently non-existing vehicles with some currently unavailable/non-existing combination of reliable,effective, allweather self driving software/hardware.

Of course there are dreamy predictions coming from promoters promoting a rationale for the dreamy technology they are trying to sell to investors and speculators.
I-Like-To-Bike is offline  
Old 09-21-22, 01:30 PM
  #19  
genec
genec
 
genec's Avatar
 
Join Date: Sep 2004
Location: West Coast
Posts: 27,079

Bikes: custom built, sannino, beachbike, giant trance x2

Mentioned: 86 Post(s)
Tagged: 0 Thread(s)
Quoted: 13658 Post(s)
Liked 4,532 Times in 3,158 Posts
Originally Posted by I-Like-To-Bike
Sure "we" can accept a significant reduction in human deaths on the roadways. You got any? - besides a handful of salesmen/promoters/technophiles (and their presumably hired researchers) speculating and guessing about what might happen at some future date with some currently non-existing vehicles with some currently unavailable/non-existing combination of reliable,effective, allweather self driving software/hardware.

Of course there are dreamy predictions coming from promoters promoting a rationale for the dreamy technology they are trying to sell to investors and speculators.
The technology doesn't exist yet... so no, I cannot give you any such statistics. But hey, we cannot tell you how many cyclists are injured by motor vehicles and how, either, because that data isn't collected or reported. (Death stats are; injury stats are not.)

But the question that was being discussed is: "How much of a reduction in deaths would it take to make AV technology more acceptable than human drivers?"

Care to delve into that?
Or are you just going to dismiss the hypothetical question based on your perception that such technology will never exist?
genec is offline  
Old 09-21-22, 01:40 PM
  #20  
I-Like-To-Bike
Been Around Awhile
 
I-Like-To-Bike's Avatar
 
Join Date: Oct 2004
Location: Burlington Iowa
Posts: 29,950

Bikes: Vaterland and Ragazzi

Mentioned: 0 Post(s)
Tagged: 0 Thread(s)
Quoted: 12 Post(s)
Liked 1,517 Times in 1,031 Posts
Originally Posted by Trakhak
Expecting those drivers to be paying attention when the AV gets confused and is about to plow into an emergency vehicle or a motorcycle or a bicycle is like expecting the people you see on bikes with helmets dangling from their handlebars to put the helmet on just before a crash.
Drivers and passengers won't need to pay attention in the Level 4 and Level 5 self-driving vehicles being promised/promoted by the usual suspects.
Unfortunately, despite the usual promises of the usual suspects, reliable unmonitored Level 4 and Level 5 vehicles that can operate safely in typical traffic conditions are not to be found.

Even more unfortunately, unreliable Level 2 so-called "beta self-driving" vehicles, which do require constant driver attention and readiness to intercede in case of test software/hardware failure or confusion, are being tested on public roads by enthusiastic technophiles.
I-Like-To-Bike is offline  
Likes For I-Like-To-Bike:
Old 09-21-22, 01:48 PM
  #21  
I-Like-To-Bike
Been Around Awhile
 
I-Like-To-Bike's Avatar
 
Join Date: Oct 2004
Location: Burlington Iowa
Posts: 29,950

Bikes: Vaterland and Ragazzi

Mentioned: 0 Post(s)
Tagged: 0 Thread(s)
Quoted: 12 Post(s)
Liked 1,517 Times in 1,031 Posts
Originally Posted by genec
The technology doesn't exist yet... so no, I cannot give you any such statistics. But hey, we cannot tell you how many cyclists are injured by motor vehicles and how, either, because that data isn't collected or reported. (Death stats are; injury stats are not.)

But the question that was being discussed is: "How much of a reduction in deaths would it take to make AV technology more acceptable than human drivers?"

Care to delve into that?
Or are you just going to dismiss the hypothetical question based on your perception that such technology will never exist?
Nobody but YOU made statements that such technology "will never exist." It doesn't exist now, and delving into predictions of its future safety record, based on when and if it ever does exist, is like discussing the safety record of flying cars and comparing them to ground-based personal transportation.
I-Like-To-Bike is offline  
Likes For I-Like-To-Bike:
Old 09-21-22, 01:57 PM
  #22  
Bald Paul
Senior Member
 
Bald Paul's Avatar
 
Join Date: Sep 2017
Location: Upstate SC
Posts: 1,680
Mentioned: 3 Post(s)
Tagged: 0 Thread(s)
Quoted: 806 Post(s)
Liked 1,612 Times in 764 Posts
Originally Posted by genec
Frankly I doubt the technology will ever be 100% perfect, but can we accept a significant reduction in human deaths on the roadways?
Absolutely. I'm all for advancements in safety engineering in autos. My wife's car has adaptive cruise control, auto braking, lane departure warning, and back-up cross-traffic alert/braking - all good stuff, and I know it helps make it safer for her to drive, as long as there are no malfunctions and the limitations of the systems are understood. For instance, camera-based lane departure warning systems rely heavily on a white line along the shoulder of the roadway; they don't work well if the road is covered in a fresh layer of snow. I believe that the driver is still ultimately responsible for control of the vehicle. I've had to deal with too many owners who think that, because their car has all these electronic safety assist technologies, they won't get in an accident. I have had to explain to them that you can't overcome the laws of physics. Just because your car has stability control doesn't mean you can go into a 20 MPH corner at 60 MPH and come out unscathed.
Bald Paul is offline  
Likes For Bald Paul:
Old 09-21-22, 07:30 PM
  #23  
jon c. 
Senior Member
 
Join Date: Mar 2012
Location: Tallahassee, FL
Posts: 4,812
Mentioned: 5 Post(s)
Tagged: 0 Thread(s)
Quoted: 1591 Post(s)
Likes: 0
Liked 1,015 Times in 570 Posts
Originally Posted by flangehead
[Limited scope: I want to try something here to see if it can catch on in A&S. I'd like to limit this thread, voluntarily, to ideas about actions that cyclists who have no choice but to ride on the street at night can take to reduce their risk due to this threat. Objective is to make this A&S thread more useful.]
Nice idea, but not likely to succeed.

I don't think the actions we can take are any different from the actions we've always taken to help ensure drivers see us. Whether it's a machine or a human, there's only so much we can do: good lighting and proper positioning. It will be a long time before these vehicles are common, and in the intervening period the systems they use to 'see' us will change. Once there are standards, we may learn specific ways to be more visible to such systems, but I suspect that will be some years into the future.
jon c. is offline  
Likes For jon c.:
Old 09-22-22, 05:31 AM
  #24  
livedarklions
Tragically Ignorant
 
livedarklions's Avatar
 
Join Date: Jun 2018
Location: New England
Posts: 15,613

Bikes: Serotta Atlanta; 1994 Specialized Allez Pro; Giant OCR A1; SOMA Double Cross Disc; 2022 Allez Elite mit der SRAM

Mentioned: 62 Post(s)
Tagged: 0 Thread(s)
Quoted: 8186 Post(s)
Liked 9,094 Times in 5,053 Posts
Originally Posted by genec

Boy, that's some pretty crappy sourcing. Two people speculating about a hypothetical resistance, and assertions about how good the technology might be in the future. And then you quote stuff out of context to make it say something it doesn't. Good stuff.

You also went wildly off the topic of the thread when you posted your "theory," btw.
livedarklions is offline  
Likes For livedarklions:
Old 09-22-22, 05:39 AM
  #25  
livedarklions
Tragically Ignorant
 
livedarklions's Avatar
 
Join Date: Jun 2018
Location: New England
Posts: 15,613

Bikes: Serotta Atlanta; 1994 Specialized Allez Pro; Giant OCR A1; SOMA Double Cross Disc; 2022 Allez Elite mit der SRAM

Mentioned: 62 Post(s)
Tagged: 0 Thread(s)
Quoted: 8186 Post(s)
Liked 9,094 Times in 5,053 Posts
Originally Posted by genec

But the question that was being discussed is: "How much of a reduction in deaths would it take to make AV technology more acceptable than human drivers?"
No, it wasn't, until you hijacked the thread. The OP specifically asked to limit the thread to how best to avoid being hit by a Tesla on Autopilot.

I think we all should stop indulging your weird Musk fanboy sophistry.
livedarklions is offline  
