09-12-22, 03:32 PM
  #5  
genec

Join Date: Sep 2004
Location: West Coast
Posts: 27,079

Bikes: custom built, sannino, beachbike, giant trance x2

Originally Posted by noimagination
I'm no technophile myself. It is obvious that any bug needs to be addressed quickly throughout the lifecycle of the product, but particularly at the development stage: during early alpha/beta testing and during early actual use. I agree that current laws and regulations are not adequate, and it would seem obvious that features taking decision-making out of the hands of a human driver should not be implemented until adequate laws and regulations are in place governing how these "features" are developed, including: what types of road "hazards" must be detected, and under what conditions/circumstances; what other performance standards they need to meet; what post-market data collection and analysis is required to detect possible bugs; what standards for protection against malware must be implemented; etc. Industry as a whole, and Tesla in particular, have amply demonstrated through their actions over many years that they cannot be trusted to develop safe products without oversight.

However, given the abysmal performance of many human drivers, one has to ask how much worse autonomous vehicles would actually be. Right now I'd agree that autonomous systems need development, that the companies developing them need to be watched closely, and that such companies need to be called to account when their products are obviously, or even potentially, deficient. The stakes are very high, particularly for vulnerable road users like cyclists, motorcyclists, and pedestrians...
For some odd reason, we give a pass to drivers who kill others... and this happens roughly 40,000 times a year. Oh sure, some drivers are arrested and convicted... but that generally occurs only in cases where alcohol or drug abuse is involved.

On the other hand, we now seem to demand that AVs be 100% error free... not just slightly better than human drivers, or even half as bad as human drivers (say, 20,000 deaths a year). No, AVs have to be nearly perfect. Even 99% better than human drivers would still equate to 400 deaths a year. Will that be good enough? I doubt it. And 90% better than humans would be 4,000 deaths a year... no way that would be considered acceptable. But hey, 40,000 deaths by errant humans is "just an accident," right?
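To make that arithmetic concrete, here's a quick back-of-the-envelope sketch in Python (my own illustration, not from any actual safety analysis), scaling the roughly 40,000 deaths/year figure cited above:

# Expected annual deaths if AVs are some percentage "better" than
# human drivers, scaling the ~40,000/year baseline from the post.
HUMAN_DEATHS_PER_YEAR = 40_000

def av_deaths(improvement_pct: float) -> float:
    # Reduce the human baseline by the given percentage improvement.
    return HUMAN_DEATHS_PER_YEAR * (1 - improvement_pct / 100)

for pct in (50, 90, 99):
    print(f"{pct}% better than humans -> {av_deaths(pct):,.0f} deaths/year")

# Prints:
# 50% better than humans -> 20,000 deaths/year
# 90% better than humans -> 4,000 deaths/year
# 99% better than humans -> 400 deaths/year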