Tesla's software is considered among the most secure in the industry, light years ahead of what other carmakers install in their vehicles. Still, researchers found out...
We’re talking about an autonomous system controlling the steering, acceleration and braking. Hardly an apt comparison to “engine, brakes, etc.” Those things are components of the overall system. The self-driving stack sits on top of them and needs to be able to identify issues with the prime mover, brakes, and so on, and to disengage if it’s unsafe. Allowing someone to jailbreak the self-driving system and override safety shutdowns is a recipe for disaster.
I’m saying that the power to tinker with a car and destroy its ability to be safely operated is not a new technology thing. Getting the government involved to prevent people from tinkering with items they own is a severe overreaction to a “threat” that has existed as long as people have owned wrenches and cars.
That’s a great point. I guess my concerns come more from bad experiences with computers randomly doing weird stuff, and from the idea of trusting my life to a system like that.
I can understand that. I have an OBD-II module that I’ve used on my EV to unlock certain stupid locked features (including larger gas tank capacity — it’s a PHEV), but I definitely didn’t want to touch anything that could cause, say, a computer crash while driving down the road. But I’ve also had tires blow out, headlights fall out, transmissions break, and engines seize over the years as well. There are plenty of mechanical things that anyone could do that would cause catastrophe on the roads. I just don’t wanna go overboard on the government involvement since I think we should be able to actually repair/tinker with/jailbreak whatever we own, especially when it costs tens of thousands of dollars.
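For anyone curious what that kind of OBD-II tinkering looks like at the harmless end, here's a minimal read-only sketch using the python-OBD library with a generic ELM327 adapter. Both the library and the adapter are assumptions on my part, since the actual module and the vendor-specific feature unlocks mentioned above aren't shown; this only queries standard PIDs and writes nothing to the car.

    # Read-only OBD-II sketch using the python-OBD library.
    # Assumes a generic ELM327 USB/Bluetooth adapter; the vendor-specific
    # "feature unlock" writes mentioned above are deliberately not shown.
    import obd

    connection = obd.OBD()  # auto-detects the adapter's serial port

    if connection.is_connected():
        # Query a few standard PIDs and print whatever the car reports.
        for cmd in (obd.commands.RPM, obd.commands.FUEL_LEVEL, obd.commands.COOLANT_TEMP):
            response = connection.query(cmd)
            if not response.is_null():
                print(f"{cmd.name}: {response.value}")
            else:
                print(f"{cmd.name}: no data")
    else:
        print("No OBD-II adapter found")

Reading diagnostics like this is a long way from overriding a safety interlock, which is part of why the "ban all tinkering" framing feels overbroad to me.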