From certain philosophical perspectives, machines can be said to possess ethics and morals by virtue of how they are designed to operate in various fields. Engineers have imbued autonomous systems with a sense of ethics insofar as robots can distinguish right from wrong within their working procedures. Artificial Intelligence (AI) applications have aided in building machines with a sense of morals, a prospect examined through the work of philosophers such as Jeremy Bentham, Immanuel Kant, Michael Tooley, and Mary Anne Warren (Evans, 2020). This paper discusses the ethics of machines' moral standards by analyzing whether or not machines can have moral standing from Immanuel Kant's philosophical perspective.
There are three laws pertaining to machines: a robot may not injure a human being or allow a human being to be harmed in the course of its work; a robot must obey orders given by human beings; and a robot must protect its own existence as long as that protection does not conflict with the first and second laws (Evans, 2020). Based on these laws, the discipline that robots display in their work can be related to being morally ethical. Immanuel Kant argues that to have moral standing, an agent must act according to self-prescribed rules that it holds to be morally right (Evans, 2020). Just as human beings follow their own inner moral sense, machines can likewise reflect the principle of moral law in their conduct toward human beings.
In the film Bicentennial Man, Andrew is portrayed as having self-prescribed moral conduct. However, this may be hard to achieve in the real world because industrial applications change constantly and place heavy demands on machines (Evans, 2020). It may therefore be reasonable to conclude that machines have moral standing because they follow commands and do not deviate from the production lines in which they are installed. However, machines may not receive moral reciprocation from their users, since in many cases they are configured in ways that allow users to tamper with them.
Reference
Evans, M. (2020). Ethics and the Bicentennial Man [Video]. GNED.