Old 09-12-22, 03:03 PM
  #4  
noimagination
Senior Member
 
I'm no technophile myself. It is obvious that any bug needs to be addressed quickly, throughout the lifecycle of the product but particularly at the development stage, during early alpha/beta testing, and during early actual use. I agree that current laws and regulations are not adequate, and it would seem obvious that features taking decision making out of the hands of a human driver should not be implemented until adequate laws and regulations are in place governing how these "features" are developed, including: what types of road "hazards" must be detected, and under what conditions/circumstances; what other performance standards the features must meet; what post-market data collection and analysis is required to detect possible bugs; what standards for protection against malware must be implemented; etc. Industry as a whole, and Tesla in particular, have amply demonstrated through their actions over many years that they cannot be trusted to develop safe products without oversight.

However, given the abysmal performance of many human drivers, one has to ask how much worse autonomous vehicles will actually be. Right now I'd agree that autonomous systems need further development, that the companies developing them need to be watched closely, and that such companies need to be called to account when their products are obviously, or even potentially, deficient. The stakes are very high, particularly for vulnerable road users like cyclists, motorcyclists, and pedestrians...