Q&A

Premise: I assert that overall, people are good by nature, and when provided truthful and robust information, will more often than not make the moral choice.

Q. Is it the programmer’s obligation to point out where reliable safeguards to protect the human species should be built in programs?

A. Absolutely yes! The programmer is likely in the best position to spot such deficiencies in the software, and there is a chance the problems are not intentionally malicious but are merely design flaws. Raising the flag and pointing them out would likely lead to a design change. If they are intentional and could lead to harm, it is the programmer’s responsibility as a human being not only to point them out but to work against their implementation. Either way, I assert the programmer is obligated to point out the lack of a safeguard.

Q. In a work environment, do programmers really have/take the time to “imagine the implications of their programming?”

A. Career programmers take their work home with them all the time, and find themselves thinking through code and design in their heads even when off duty. I’ve solved problems plenty of times while daydreaming or even in dreams. Even if the work day is fast-paced and hectic, the programmer thinks about the job during off hours, as do career workers in any profession, and in that space there is plenty of time to imagine the implications of one’s work. Will is the variable, not time.

Q. In the work environment is it possible for a programmer to raise questions of “morality” in the absence of laws that set boundaries?

A. Yes, by thoroughly detailing and articulating to your team and supervisor the ways in which people may be harmed by a particular design choice. The more people who are aware of the moral shortfall, the more likely the group will choose to alter the design and remedy the immoral choice. One’s true moral disposition is revealed when operating anonymously, behind a bureaucracy, or in a legal void. I fall back on the premise of this Q&A to assert that more often than not, people will make the moral choice when faced with accurate and complete information.

Q. Are certain programs based on their content immoral? Or, is it that the only realistic boundaries are on specified immoral uses of a program?

A. I think it can be said that the content of some programming is clearly intended for immoral use, or designed implicitly for immoral conduct. Premade hacking tools used by script kiddies, or peer-to-peer programs used mainly for piracy of copyrighted material, are a couple of likely suspects that come to mind. The peer-to-peer file-sharing programs are an interesting example: their EULAs usually state clearly that by clicking “agree” you promise not to use the software for copyright infringement, yet the authors are fully aware of what type of content is predominantly shared using it. The EULA exists only to avoid legal accountability; the authors fully understand the immorality of their software.
End users are also accountable for the purposes to which the built-in capabilities of software are put, especially when these tools are hacked or used in an alternative, more clandestine manner. Clearly, if the software’s internals are nefarious and proprietary, the end user has little to no discretion; the onus resides mostly, if not entirely, with the software manufacturer.

Q. Does the fact that there may be an immoral use of a program mean the program should not be developed for a beneficial purpose, and if you only outlaw the immoral use, is it practical to think it will be enforced?

A. Software, like any other tool, may be abused and employed for immoral purposes. If software is used in a way it was not intended and that use is immoral, the user is primarily responsible for their conduct. Once the immoral use becomes known to the manufacturer, however, the manufacturer has an obligation to update the software to make it unusable for such pursuits. These are just overarching points; this question can only really be answered on a case-by-case basis.
Regarding enforcement, here is where there is a massive divide. Society has seen a tremendous transition from the analog world to the digital. Governments around the world have been pitiful in their efforts to put proper laws in place to protect their citizens; in fact, in far too many cases, governments themselves have been found using technology immorally. Here is my key point: with so many gaps in the law, owing to the geometric growth we’re seeing in technology, there is an unprecedented opportunity for people and organizations to use technology immorally. In addition, current laws legally permit actors both within government and throughout society at large to conduct themselves immorally through the conduit of technology. The incredible burden of morality in this landscape of blurred lines inevitably falls back on the individual. So, to answer the question: no, it is not practical to think it will be enforced, and this is why personal morality is so desperately needed.

Q. When a program can identify and alter gene sequences that could lead to a cure for a disease, but could start a pandemic, as with certain recent bird flu research, should this work be pursued?

A. I would assume that if any program carrying such incredible risk were to be developed, every countermeasure to prevent catastrophe would be employed. Such work should be strictly regulated and managed; I imagine it would not be allowed to take place if there were any potential for accidental exposure to the public. This extreme case reinforces what I’ve been proposing: a great deal of careful consideration and moral judgment needs to be exercised, especially when leading-edge technologies are being developed and the potential consequences can be so large. So my answer is yes, this type of research should be conducted, if and only if proper safeguards are in place to ensure no harm will come to the public. If that condition cannot be satisfied, then no, this type of research should not be permitted.
