
CHAPTER 2: RESPONSIBILITY IN ENGINEERING OUTLINE

OBLIGATION-RESPONSIBILITY BLAME-RESPONSIBILITY ROLE-RESPONSIBILITY

All raise a common question: What is it to behave responsibly? The authors suggest we accept a reasonable-care model: one should, in a given case, exercise and apply one's professional skill, ability, judgment, and taste reasonably and without neglect. Doing this typically involves more than simply abiding by the standard operating procedures and regulations that apply to a profession, or fulfilling the basic responsibilities of a job as defined by its terms of employment.

Generally, we also want to prioritize the obligations that fall out of this:

Do No Harm Principle: "Other things being equal, one should exercise due care to avoid contributing to significantly harming others."

Proportionate Care Principle: "When one is in a position to contribute to greater harm or when one is in a position to play a more critical part in producing harm than is another person, one must exercise greater care to avoid doing so."

Let's separate the forest from the trees here. General Idea: Responsible action is analyzed as action in accordance with standards of reasonable care, where that is something other than merely mechanical rule-following. This doesn't tell us much yet, but it tells us some things. It tells us that to act responsibly is to show deference for others' basic rights, as opposed to doing good works, and it tells us that the best way of doing this is by cultivating virtuous habits of action (e.g., honesty, reliability) grounded in virtuous habits of sentiment and thought (generosity, compassion).

Given that this, in broad outline, is what responsibility is, what are the impediments to responsibility? This is what we were talking about at the end of the last session. Remember, we were looking at the case study presented by the Columbia disaster: The Columbia Accident Investigation Board identified three types of explanations of the accident in their effort to pin down blame-responsibility:
1. Physical cause, 2. Organizational causes, and 3. Individuals responsible or accountable for the accident.

The Board wrote: "The organizational causes of this accident are rooted in the Space Shuttle Program's history and culture, including the original compromises that were required to gain approval for the Shuttle, subsequent years of resource constraints, fluctuating priorities, schedule pressures, mischaracterization of the Shuttle as operational rather than developmental, and lack of an agreed national vision for human space flight. Cultural traits and organizational practices detrimental to safety were allowed to develop, including: reliance on past successes as a substitute for sound engineering practices (such as testing to understand why systems were not performing in accordance with requirements); organizational barriers that prevented effective communication of critical safety information and stifled professional differences of opinion; lack of integrated management across program elements; and the evolution of an informal chain of command and decision-making processes that operated outside the organization's rules."

What this raises is what the authors call the Problem of Many Hands, and the subsequent problem of fractured responsibility. What's that? "The Flight Readiness process, which is built on consensus, verified by the signatures of all responsible parties, in effect renders no one accountable." What does that mean? When responsibility is spread across everyone who signs off, no single person is left accountable for the outcome. So let's talk about why.

Impediments to Responsible Action

What attitudes and frames of mind can contribute to irresponsible action?

Self-Interest:

Fear: fear of acknowledging our mistakes, of losing our jobs. Whistleblowers are not really encouraged in this society. The Columbia Accident Investigation Board observed that "fear of retribution" can be a factor inhibiting the expression of minority opinions.

Self-Deception: The main mechanism for this is the normalization of deviance. What's that? It is a practice of adjustment wherein the boundaries of acceptable risk are enlarged without a sound engineering basis. The Board observed: "With each successful landing, it appears that NASA engineers and managers increasingly regarded the foam-shedding as inevitable, and as either unlikely to jeopardize safety or simply an acceptable risk." This was accompanied by a shifting of the burden of proof. The Board remarked: "In the face of Mission managers' low level of concern and desire to get on with the mission, Debris Assessment Team members had to prove unequivocally that a safety-of-flight issue existed before Shuttle Program management would move to obtain images of the left wing. The engineers found themselves in the unusual position of having to prove that the situation was unsafe, a reversal of the usual requirement to prove that a situation is safe." As the Board observed, "Imagine the difference if any Shuttle manager had simply asked, 'Prove to me that Columbia has not been harmed.'"

Ignorance: According to the Columbia Accident Investigation Board, there was a kind of "cultural fence" between engineers and managers. This resulted in high-level managerial decisions that were based on insufficient knowledge of the facts.

Microscopic Vision: Overly narrow technical specialization, or excessive emphasis on one's own distinctive role.

Uncritical Acceptance of Authority: The Columbia Accident Investigation Board cites organizations in which dissent is encouraged, including the U.S. Navy Submarine Flooding Prevention and Recovery program and the Naval Nuclear Propulsion programs. In these programs, managers have the responsibility not only of encouraging dissent, but of coming up with dissenting opinions themselves if such opinions are not offered by their subordinates. According to the Board, "Program managers [at NASA] created huge barriers against dissenting opinions by stating preconceived conclusions based on subjective knowledge and experience, rather than on solid data." Toleration and encouragement of dissent, then, was noticeably absent in the NASA organization. If dissent is absent, then critical thinking is absent.

So, this is all about what people in groups do wrong: they suffer from inordinate fear, uncritical acceptance of authority, and the rest. We can still ask why. What happens to the thinking of people in groups to explain these practices?

Groupthink (Irving Janis)

an illusion of invulnerability of the group to failure;

a strong "we-feeling" that views outsiders as adversaries or enemies and encourages shared stereotypes of others;
rationalizations that tend to shift responsibility to others;

an illusion of morality that assumes the inherent morality of the group and thereby discourages careful examination of the moral implications of what the group is doing;

a tendency of individual members toward self-censorship, resulting from a desire not to "rock the boat";

an illusion of unanimity, construing silence of a group member as consent;

an application of direct pressure on those who show signs of disagreement, often exercised by the group leader who intervenes in an effort to keep the group unified; and

mindguarding, or protecting the group from dissenting views by preventing their introduction (by, for example, outsiders who wish to present their views to the group).

The Columbia Accident Investigation Board described an organizational culture where "people find it intimidating to contradict a leader's strategy or a group consensus," evidently finding this characteristic of the NASA organization.
