Grady Booch, an IBM Fellow and co-creator of UML, is using his position to preach the gospel of moral responsibility to software developers. In an interview with Charles Cooper at CNET, he says it's not simply a question of whether something can be built, but whether it should be built.
Good point, so far as it goes. And it doesn't go very far.
He uses the standard analogy of nuclear power and nuclear weapons to remind us of technologies that may not be morally neutral, although he doesn't commit himself to the view that they're morally bad. He only commits himself to the possibility that their moral status is open to question. Of course, it is. So is the moral status of every piece of technology. A nuclear weapon may throw the issue into high relief, but it does not tell me that the inventors or builders of a nuclear weapon are morally blameworthy. If he thinks they are, he should say so.
According to Wade Jolson, Booch mentioned consulting for the Department of Defense during a speech in September. Apparently Booch doesn't stop himself from working for the war industry. One wonders, then, where he would set his personal limits.
And according to Booch, it is an individual's responsibility to form the questions and the answers on these topics. He told Cooper that "at the ultimate level, the software developer can say, 'Do I want to actually build a system that potentially could violate human rights?'" But what doesn't "potentially" do harm? Too many marshmallows may lead to diabetes, even if they are fluffy and delicious.
A reasonable person might think it's okay to write software that controls the video camera at the convenience store, which gives the clerk a false sense of safety. But when the same software is used to track individuals on public streets, is it okay? Booch raises the question -- which I don't deny is an interesting one -- but gives no hint of the answer. Nor does he tell us whether the original developer should have foreseen the problematic usage of the software.
Booch jumps from technologies to individuals without stopping at the organizations or collections of people who commission the projects he questions. Why is it each developer for him or herself? Why is there no guidance as to principles that individuals and organizations should follow? Are there any clear cases of immoral software? Any clear cases of software that can't be used to harm others?
Booch has the visibility to get people thinking. Can he use it for good?