Autonomous cars are now a reality. These cars are still only in testing, but in their final version they are expected to be among the safest on the road. Even so, self-driving cars still have many steps to pass.
One of these steps, probably the most problematic, is deciding what the car should do in the event of an accident.
The difference between a human and a robot is that a human acts on what they feel in the moment, while the robot can calculate the probability of injury and death. The problem is that the car will do whatever it is programmed to do.
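As a purely illustrative sketch (the maneuver names, probability values, and decision rule below are my own assumptions, not anything a real autonomous vehicle actually runs), the programmed choice could be as blunt as comparing the estimated harm of each available maneuver and executing whichever one a fixed rule selects:

```python
# Hypothetical, greatly simplified sketch: the car scores each possible
# maneuver by its estimated probability of causing injury, then blindly
# follows whatever rule it was programmed with. All numbers are invented.

from dataclasses import dataclass

@dataclass
class Maneuver:
    name: str
    p_injury_pedestrian: float  # estimated probability of injuring a bystander
    p_injury_driver: float      # estimated probability of injuring the driver

def expected_harm(m: Maneuver) -> float:
    # One possible (and very debatable) rule: weigh all lives equally and
    # simply add the estimated injury probabilities.
    return m.p_injury_pedestrian + m.p_injury_driver

def choose_maneuver(options: list[Maneuver]) -> Maneuver:
    # The car executes whichever option this programmed rule picks.
    return min(options, key=expected_harm)

if __name__ == "__main__":
    options = [
        Maneuver("brake and stay in lane", p_injury_pedestrian=0.9, p_injury_driver=0.1),
        Maneuver("swerve into the barrier", p_injury_pedestrian=0.0, p_injury_driver=0.8),
    ]
    print("Chosen maneuver:", choose_maneuver(options).name)
```

The point of the sketch is not the particular rule but that some rule has to be written down in advance, and the car will follow it without hesitation.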
What happens if the car kills a person? Who is to blame? The programmers, the company, the owner... And the most important question: if the car is forced to choose between running over an innocent bystander or crashing and killing its own driver, what should it do?