The authors argue that the presence, or perceived presence, of certain key capacities could make people more likely to hold a machine morally responsible. Those capacities include autonomy, the ability to act without human input. "When these robots inevitably do something to harm humans, how will people react?"
And when it’s someone else’s problem, no one takes action to solve it.
On the other hand, acknowledging that one is ultimately responsible for the results of one’s life, thoughts and actions creates a level of freedom not experienced by those who blame others — and it empowers that person to act.
Removing fear and establishing a take-responsibility culture begins with the leaders.
But for the followers to adopt their leader's fearless attitude, it's imperative they understand their leaders are on their side and want them to win, and that nothing less than their highest degree of execution and performance will be acceptable. A leader can't expect followers to change their attitudes while he stays mired in the old blame-based thinking.
Meanwhile, Sally in accounting emails, "The client's invoice was wrong because of our miscalculation. We've called and apologized." And so it goes with every employee, in every department … Instead of facing the workday with excitement, most leaders want to crawl back under the covers from sheer dread of what actually awaits them at the office: excuse making, blame shifting and responsibility dodge ball.

Once those fears have been identified, the leader needs to figure out which behaviors to change in order to set a better example. And if (actually, when) the leader screws up, he should set a good example and "own it." Overall, the rewards of being a fearless leader will far outweigh the consequences.

But in an article published on April 5 in the journal Trends in Cognitive Sciences, cognitive and computer scientists ask at what point people will begin to hold self-driven vehicles or other robots responsible for their own actions, and whether blaming them for wrongdoing will be justified. "We're on the verge of a technological and social revolution in which autonomous machines will replace humans in the workplace, on the roads, and in our homes," says Yochanan Bigman of the University of North Carolina, Chapel Hill. And those decisions will help to shape a future in which people may increasingly coexist with ever more sophisticated, decision-making machines.

As the technology continues to advance, there will be other intriguing questions to consider, including whether robots should have rights. Already, the authors note, the American Society for the Prevention of Cruelty to Robots and a 2017 European Union report have argued for extending certain moral protections to machines.

More information: Trends in Cognitive Sciences, Bigman et al.: "Holding Robots Responsible: The Elements of Machine Morality" https://com/trends/cognitive-sciences/fulltext/S1364-6613(19)30063-4 , DOI: 10.1016/20

Citation: When robots commit wrongdoing, people may incorrectly assign the blame (2019, April 5) retrieved 7 September 2019 from https://medicalxpress.com/news/2019-04