- cross-posted to:
- [email protected]
- [email protected]
Attacking AI-based systems with malicious input is like shooting fish in a barrel. Combine high complexity with low comprehension of how the internals of the system actually work and you have a field day for security researchers.
It’s only a matter of time before someone hurts or robs a Tesla driver by forcing the car to do something that’s not in the best interest of the operator.
If self driving cars become disabled by putting a cone on the hood, maybe they aren’t ready to be on the streets.
By that measure they never will be. Because people are never going to stop fucking with them.
I took a ride in a self driving car. There were 3 different people who just walked right out in front of the car. One of them crossed the street then turned back around to walk back in front. People really enjoy fucking with them. This is why we can’t have nice things.
They won’t be ready to be on the streets until they’ve had a lot of time on the streets.
It’s a catch-22.
I wouldn’t expect a human driver to move either if their view were obstructed or there were objects on the vehicle.
Humans are capable of assessing and addressing the obstruction; meanwhile these cars are permanently disabled without outside assistance.