Categories: Philosophy


A) Could we have ethical responsibilities to beings that are not biologically coherent, such as networks, biospheres, artificial intelligences, or other beings that seem to act, intend, and communicate spontaneously? Would those responsibilities be the same as our responsibilities to coherent living things? If not, how do they differ?
B) Explain whether we have an ethical obligation to forgive ethical harm done to us, using at least one ethical theory. Under what circumstances, if any, is forgiveness required? Or is forgiveness supererogatory (ethically good but beyond what is required)? Does the answer differ when we forgive harm done to others rather than to ourselves?
C) Under what circumstances is violence acceptable to defend abstract ideals and/or to preserve moral agents who have no direct relationship with you? Is violence acceptable to defend moral patients (person-like beings that have moral standing but lack the capacity to act as agents)? You will want to include a non-dictionary definition of violence in addition to an explanation of the system of thought you are using.
D) Do we have an ethical responsibility to be healthy? Is it even possible to have an obligation to health? Consider how health is defined, as well as how defining it can exclude those who are already culturally or socially marginalized. If such an obligation causes harm to persons as agents, what arguments could nevertheless make it an ethical good? NOTE: You may NOT use body size or HIV/other STIs as an example in this essay.
E) It is…sometime in the future, and you are a judge in a case where a human-like robot is suing its owner for recognition as an employee rather than a tool. The robot claims to hold all of the same qualities, desires, functions, and emotions that a human does, and that it is therefore more than mere property and should be granted personhood in the eyes of the law. Will you hear this case? Does the robot have sufficient personhood to bring it?